var/home/core/zuul-output/
var/home/core/zuul-output/logs/
var/home/core/zuul-output/logs/kubelet.log.gz
[binary gzip-compressed kubelet.log data; not recoverable as text]
Y)di"U8y6T[brʽ;SY%EsN eY{<٫b$w)l8tLOr*'TظS>ĊгAMgSy5aMy6էo .V%Pȥ@ NhXOR7򺇜lM4s L}+%{?!#dW9zt\\kLSs"dGN?KɤV9?yNo>i׻r~I.$^]~US|{.z8]ȳ+7+kWkۆ\i65$ u#b.> og=p`׍F)RfQV,$IQfQUQU *Ρ84|66Y^ n/7&ҧ20 nZd#Jy3ŝ|OwG,\lUZ qE" a1+]XCt" A,8^El&6D|cɷčFaH=l)iN&M, ]9Ҵ<G"ceV,i)rڮ-eaH.Kzhrwo:LX:_;EA/ hA#AF4 hA#AF4 hA#AF4 hA#AF4 hA#AF4 hA#A.,n_ \C^ AТe#[AKJs&lkϪY]lÒDêi#mW&˅t'"ug:i _F=5L)`طu~)[!Y'^hϹ"$I(d&@B""p!dhL# G&@89\~v۔%K#ny3zl/\|/u7| 2bU*C̻m3hIS$) x"+R]iIH 8=P 5dk;Wn\wtIx9ª`z`LylϏ_+=/vDiksFw'6$Kɉ! -^k1  ,;+DA^>˂2s/uq݀{i^P]kɺ6Y6 ";NK_^[n|-WYX+ )[fVq/3ۛq}&-1eps=$!0d2#NWz|'R݉iz1x $u" }6',+ɪ*<. ʕ3zcOCys"݇y8ۧ0( &~>תR~inLGj с4{\)'Y &vhi #/j`p;ʗK~_j0_zXrYjO_՛DK&K =$8E%%99+|׶nN84'Ex!FK (u ڤJxdwu2xV̧T[=L\LqtUN?)OuOg5ʫlԞ\ӭI)}簄P9Bܹŷ2#+2?Sw7Ccp<ϟ-vPCmլ#|sƫ;JBz$콒tw!z/>XDi<|<J>nb8vsZ^W]dUUQ"oږ͌#( M"~JO`2xऔ-'U0&~l89 ?|ן~,}ןo?Ο RƟl/O2'/-iKFsiS|*]{p0rvVkk787a&͛(-vʦ9Ʊ0̆i~1o?8EgXm.bB }ucxckt7ؐ'u^%j/[M8pb-XDXFHYa,*G&J_yM7}>c G@vyȍ)bFdL 2.'Dq R̐;U6v4|]8ڧ;wa}<Bo]lD}=v{{<2^W#1MjuWk e>o= \ߢwE \XCyHK9iq_C"Ko7(ӈx9side`)֚PA Ϫ y @i*HsxC4SBQ_TRKUI"N5sv3ONK`b1h=NնM>b煁/-Qֲeg7[R9Dge+ѩL&B`A3>䠍e4[i.an:~+;(Tʁv&7w;͆o(Fgo`FQZf6f^ooB muV!ё9kIVdV:LtÁ un5Ԓ]RBKkR಴OyY"&9CLy%TIn⮛;8i+iߓ:Y7wKzc=Vx ^(cpzwq5ç0.@ۨX!DPEm#RV5.e# 1o0/N-\ZcRl_S}LY(JM>j-1;,v&Sb8#DcΜH* X!fΞ[]C>Gsnv62r6.(\)K-y FɔHaBFeNsAA1* TʪUIʼnZ͜5f ai62i3_Eqrʅr!须MF#t19:Z~c;TI=?O{^yEm`G8OYBrb)O6qRapJQ E"!mDW5*zjbz8ʾ/|V_D=;]p_{ӆ٨/4Ez4lg_NV1p.!, )'N)׉{)/)HE\csóP=!xϊ|\PDF!83a*H|ЦǮfn'㸖b׮6:ڼ169L!u%QdZݥd^LTx4:-BU.x# "C* O3AA<Øea6&i*j]R9+XP]T'ϼa< 0sٲ8[sϫn`48cEYJ,-e'a&U4l yb;f#Tdf8ǎhMS(cNG%wtE[ecK1w/^ha꓅vfٰMo:!&fzfkH\ T6ip+aĪh>WEJyP҃#%A*:w1qRK)h1 TL*85<*=6;f#5O`!KVeY2NYH[rmY-sc%eR!Im4Dv_W^x=GƮZ{ػ̼ g=xeUF/\8juytb2n٧`gL$(mh.JJX-4/)o3أcyD?{Ƒ_!eĮ~ھEmMSb, #W=cȑ8Hi ۲9͚GU׻Ѐ?L4 2Ձ"- Ɯ7R#4Z"@$ &b:c;Ct/ɗ/@B(lFnO-[<[j>j~iWֶ{E"jcHJO$^\P6:aL1wAN IڣN&m6ŀHN;TzXqV[OuTxPK ɓKuv Wzt+ͭ͟I7皫'kPYj&Jzy%;Cy,?zn (\UV ׀j1*ԀrfL?$Bn(d+MV_USlm|O#,;k,~ ᥌Vsi@͆ij]W ߨl{Cpᐌ"o_"$?7垄j<-{DA|(zp;L`cQP=p(-ˤj" sjj\:w(+kL%n ktF Aoym^UFgOb4*(&d#9`7 "fPlʩ<%ʩ"HMݘ//&#TWx-8]/2H %IIq K7s.O[%r~YQtGj c2iiLu'rZqbfaʅϫ8ϔSMwDJA;bU5{gSGĮ2k]et:vr]e,+pD*W1h);vϮ2gW] Vݱ+DXJv4*FDl2dgW]ICߕ]eخ]CFgWK)zv ٕbduSk~UrE ޭ*bvЂo{Cw fbhռY}J߀>%H o?~5|tQMT}Pqp/׮#qR|LNӥ(xRGN,JW"9E:i^k~g\,?*4Ph8O Sl5\ LÝH;j Ϊ_> 8AqTeEs v+CoOLPsEP%xQ "ʭi6&wɏ4׻,RZX("TpOL3"5NE.~I۽Y$W&1 n, W6O9W;8i%HC WGet9M.co'BmQQ(mcre  Bt͚UƖCgW˵+gW]ècv92O(-m2dޞ騒ʷe Gc8{5A_'<oc&ÜWRu΢,^"ӫPM/ Q_TE[E4N%Ie[s ,h*a-N !Fg0QCW-<ðs+vu19g\5Y%>>߮D>l⼿MM݂2C8p)Y>ɒ]' btE pyV tN0)i,RTLᘕENp&{I\{pin{EQB@H#IE/ $8H3djVw8I2b: *ÔqrDtDs) ;Cg{⸿*Q[<l5'c28Ӏ04Z>n׵Wl萝]nSY"^b*(#D8H҆k|4'^i$EmcH"G ]i㹳jǘ2D*<1`ڙV:Ce"HO)C)*R1AmF87={J aυr*F[M| % mPq4jcjA PQAt9Tdm24De[ 谰|4]UEcӘӀ0қ1m(!)ASJ@=͢eU{ٴ^\m)$m<A}qMy<hؚ9,1򕓍e_ oz 㲔We {m '+ɮENcoa%~u.]/ai26xzWߥO~xݺ}wυ~k{;+_K,=.E- r_ X>;6.3zn;X'/'7M5Wnh~>x7yX)YLRBLA-m/9ۻ`+?DPߴo9y ].C[GKj:wT2/JJgr>:m1;3_ VL!G9 -f_q"^",^˃!-- --{& r$n͛ۓQwws$.M2ϙ 1;L%ի@ 6Dk$>L 8u\ uj6R@k"ԧ e"Xr'ZZnURiˇT/@G($ub(+QH/ƾN1V^=x1@[BX$XȽ Ձ"- Ɯ7R#4ZmCH Mt,vNKY_Lg/__8P7w&[vnVK {5?_iյ{2=]ǭĄɵWN uQo&I8%$i:@ B"9P@cF-*~[mMrf|?)ZPR%C-1$Ol1]^&yjr_?;Zz*F0 :EKn+z4K Z'~Kܝ b'QF %k?y 2 :% 1Ƕ#.ƣW1w~S[Q<94!9q,%gJ!9˝RZ`E;]VN?ȴyi$Omꂹmoz[pkovܲ v~i6Wwk,wySYzuIϺ02BPpQSԅ!R1rtJTM{2BoZ}ۗi´I@88M X^sdt$U+Dwܩb1J&i0Q\y7p&Q MfpR9*;6tNK79Ijx3~NFM?n4U2fL6h,]kh~y ץ5xYddT0fo'yWqIg܈ZC3 bj`{;0MM͖:?0 qŝ~#iN%LReLP(QU=Mءd^/#m&;y2LDDld׳z2 7o<]T}nׁF vG"8`V(74F^ LTzvd>9t1r(G ƣ`{' ԰FK#Q2,!1JIc#3i=(K`W{jzar6X9LFWD]^'`zǷ |76E~XdDbd;?>Sz>>Mw|>J*EνAU,J] Z&fP棎)Zaí='3-kxмؼ_݆O- JCypS,O۟ivV~Tg-]̼mٕ{qL?]4Mis' +˚ɴtTjLV2B‹^@M?'ouHy?>3D yLt= {p~6pz?*qEQ,R@rV+,,/^V6̋>(a=|v#<['Mծ VMbZ:la%啺v7zY巿+aCQ5q>jWcz:U%tPǢ֯EWK.<)/TA/Ow_zt:1QOc[=JP,€!0`5Cl(zqQzlvNP Fc;ڮ =;\,tIѮ,UL1oca`X/R'\tP 
d/cWbk!&z6Vg.ڋ+E>'*EJ@"y5 o#1l, |25t8'\]y7/aNp0`ف5 PHLtfj;y:wG:0 8Vڦ`\HkQ*JNܥֈ)(7j<[֑aYqGÀ*R.k%%RH"3Fv&v?KlC <4:P EZ[!'u2׀b>DP.ধnry9ұY IRR4QeW:ؚ~Eo!erYT,I‡] mËàO$ubgJ:J-,[xiܭB)I'pi ƟRiy(Zbˊ]M'b-}<݄/V,î2LJX:y]k{%xD_%az@@맨ώ-*4\5Sކ`8e`0*@!e$(}ކ|􌕢}G.qyp`Wx>{4|<(F1=ikÈeBKg#>0E! &Q*\.%6!`򙄉c*SeZb"GgZs1eϏ)B<8ٰeYVZ4װ+΂ sI| yEah0ؑh4 #LY.Tn9$,j ѰTxGDJIE$s2JaQiƝV* 8@' Ll: +%Cֆ(E+vaVXɦVjt-i q!:^%h/csCmM]Y1)Si%2FGs.!RKM72X :+M/Hyi!huBbGZkbUMH0[2D84T}v]ZI*5.%%y\H'"ga^L>DC &֗0qSۗ 榘#IyA yZuQt=KF V` [gD0°'WLzԝ,\ڋ/'9(˓pE@xU#0pgҰI D LȝEaWC" `Gy!] 푕,HkS*R2yp';Œ!ǝ:˃b8(吡L 4v@ m541%{%Q;Y.v*uV6JRgr|tNX NЄjw=Y:`ZȜA&Fn6l>, Pga>5F`RG43Z q [ 11e=o{;Ueŷvqʖee#)T\CU}J#68R$#Z{1[,kѶ!P\ ~rPTs _n6)Z &tK['}(IQ~&ԂȬm髵ma<4nuA~A>v\\L-jݶYhͶƏ 'T16NDOb,Pː<ʥbK0\Ck?|93 5`Ё FDOAsۆ؆yf)'r:bDsqc>lw^ER2 mA8yӳc"{&CDtX{))qJ7DŜ(ar'rF ЉqŰ|=ޤY ^9\(:zmw_|Q3OJ~CDt`Qw-͍#98,ql^&fo3|d-/lAV,fGuB3A2>9.xR -" A8&Ny `R:9Aw uwӦtھdOd ދ }yNc\Ϳ!XglڼG lJaR7Wht p71)-P`/ٟչs8 `Aࣶ8fJO`Ye F$L2;7Zz<}9z>ᮛ's \Rcc9m6OPDPHEuwmF__4!E5[Ƥ ЎUʬFk;Rsi3> vD}(4`c̀p{}Қ\y.M#gs,ߟKVy}% &%MN0A0i {CfK"ihj3J8IH.9m3(rU P\dj[j췌J=[Xkej +7bG{[}+ȸ{eOǰLٲ/hl61Y>t=椝|JK3K 22r9cqY6<'.'V!3rY ,ӞRU{ؐ\LȐ+Kd`.PI39`ClN&ƱXjP*[D5Y"Z6*kĄZcM;2Bi A!Za"V;Θ|cJrR>eJ`31ڢn%ȓ&"b* n-b5q$._Hvqt5XgUr]ԕdwWc5 UJ40L@DE:(YBĥ!Ovvq,8f` _ksvi#f{im$)DُN5EQ ]6?h-ծOW+fF).4ѨBVr1eAN\LZ/2*eø\ 1IiZ+VIj.f^ǎ  ~D1ce}g7omzzw -_ò.'ttt$L0QF)TZj$&+ %YnժI O^0|* Q{6Gv)ٙl:G"oٺgiKz w9xx߉n~o~u 4wp6 +F3"% 7gC[ĕ\{ZWH5~B^iUÌnf:G{ZMX< S!5dyd]N #o_ub J7QӠZqSw 'Ÿ9.#qjp]#0"XgJ]$$$AyJi9$Un=Xb3;VD|&c},3oc'z9eNȂW]&{g7疁QۡE$7Q3Q^OfSMR#T(,KA$/Ē9Jp:r]dNPX4pe< !ĺ$#`Y*KoM|R1࢔L%Ibٳ\F8/!Ȥ)ϼ0c%O9xELIZ 1h]hy,*6iJ<}uYN3m]LaH[":aK !}Hϙb:s{h\RWh?ҸtL|rx % A)#dGzfEFN 7bұR }l]{x>垢>U";AS8D@W7enRhWFf9E& %ydScu`lT9znlD.8](3 !V[BZhBS7%Zdb&%B\c:yv (Ǭ9&ACvɂSj3vm҂WyJ n+[-F {wWxaCu} `E% f«'«V2fj 隶.呴9!WmB(q2V6[X2 ~NcGceL8S ,ф˂tYK}ʉ'2X*=Zũw|npВw| ۛϔ yr܄o5Ś߯<y^2n|5S\Ĝ-96ȇP:s9y|c6Ns`8SLk]#N .HJ80;|B٘}ͣ}w`,hs4v4})1 @=Cwq? Se9yjP˨_ A',n[ݞ&~teCWn v}1|uP6<5?A3pr" dP ADdq2Y*fe_=`8Ė^bN 3H^gԤN3E*ʓWQF[e} -ztv( 'ԛ@Qm&p8\Jx/ ]rKYxBu e"iT z`Fp"%nI1j z2o7}w'gI .RN|{m-qVzgoZM_kz'w~Ixyys-,:zY.1|dpU q$}r$f irϋVj~avdzT?.p1X\-hӰZ'u0 *I{F?/гFfЈ}ϯoXZVzfe"\{̂&I%cw+O/%q;f/pY[Mަͻf-ts讹(.%[^ͷzCZKh>v6+0tns/֮m띳oɵ-Û3mK%oQ[p%SnJ uM_|RX L bcMqhXOݠsKsAsMo=}|<@NA76xW1q+$S{P=+WN]ٹvwZY nσY~}dYԺ5GI5W.9j!9TLƀB+wmH_6W0 vgb Xt|$'bG,ۑi[N:@lլfW},VdbR5ƐjUfǼ) ߣHl`Kȧ DP) H?.LIh] M*DrTX s;ߡ0՞qN/Q`j$TRkY;[0c83`"\Ma]و(rY&Nhd.XbvE)dHG@ȨU|"bQ!YIz9՝PTP B[ú'rDZĠY3qnig]} l5{}`K1 >U9j-GDE BdH:U|Ju[hi_-p_&$I| WZCG~Biѥ(vTx_~^sjZ׃Ţ ㆇDi- f%,FU3A04spZk[.K{Zޝ̀o3Lk'J: +s^ u{ܑ 6Ftb EaNX~ѳYWo"jl|Tɯ!'6Xj>3u^xf$ۍ` cM֕vbhj!f$Ǜ %]:J Pz"QNV C#8molz)JG5j1 M;Ơa(ٙltERL&[s6۔**bKڗI0g1N:c˞K*>j m@&6LwumHm c6uE1&B( mb *YJ,f.WPx#];} ?,x#<dAp>D}<;ORQ+?>T% :nn(\ S%MNrbc}rS,YK&i yè@ !˜"Ut&a-bX`]@]biʮI1j5ׇ[c 7S[BFjݳ+@1b^{]㱆2r[5;o@A0;oaj 7Qv$T:BWxTr45˄ޢmzknTmJ6;AH]I .Yp&xBm0GfbSȐmi!i;]Fy/&5^}CuOqQ>Ѩ.qCoh2uu`cB~ro4mM~n1y;90ٰYe*qFk"[y^Օ[m.yLz O??jr b݅mZ…0u?PGz %Touvb|)` ɴϰH^?_<[lԯx3ygO wE͑L?ME{ TOt7qzIZȒ~ҮݤpМ ͒VxDq8k{=J^_U"|bl]t["|l`Qavg|LME?7JZCՋZ-3?Nf } ^h_tL̈[α7bm}<9E}7=yϸ<-I<_VO/~u1/#C I }@W0JG)F2R̭UUD~1d&!Ĺ]VZޥ+He9Q1茊N))@@*!-DJdy%MҲFOB5X)m m2uPs}n^4Mk8x~ܩM 䩧œ̧ߝme0h+IhMšW)lTB̰)B޸\BR9R}ZVZ[Z#c3qnFv\Ҍmm+]XzօV槕oȫ8ٯZܼlqls/.x|a|4MLT=f4^k>fNH-"v`3U+0S1PUu*m[{]:{13 C Bv*dWطc("AxĹ$s1wiǶP{`iP)`ZP,غē%EbP 0AFm c]J`/59&@HyL@L,jruf܌S*RqW~la@M*T. u()Ibra5 EB|SD 5:`ߘ hT8rBt<"{L_td[s3"~xClUrҒmq4E3​צD&CWMb&%m,`$#Ci\q%. 
6ӎm6CPyyQM#oupeEE} !X󁯍WQ͈lTvK=ڮu:솎fʮ^Cl ێ&E֞T|5Lc>=c`+ LHJUm`U*t 9ZAGg'8hC M1$H6HS쥣" jdOA c`Q.R4&CjkHLڑP|P(J_2Hk(XJz9Zm6bU'X1b.ʘF'iM~5zƥ~oO9-Q:=Tr7,`9CS Mdoe9Ə^Ta.":3kS.3Sa˽x?~&^*F:5xaOzvPRH~{ޡlFiqo6g<ޣqV-ꖲ]8eZ;A"̫Rֽ󷓘)hBq.@%:Oyuz>᭮&WьvR)~cś]3=8RmtᙖWiFeqU偗€/?&e|tfѷ:"VO>7q1M^Z2ll -ojFmn`c35bIQ܃H~2\uͣuVCnjuӳZ)YA+ VLtUFuF|_b3JefG{_߿|?w?>ϗ_}ǣΟ5u&ﻊ˝|M٪inoմp}˧^v-ڵ#ohsQ-ހmMG3ыWkO]t:5+>$+,6"3l[R39x{!#q0G&, WGic[urdAI[hCR1dKѡE>mKGgVv{wfIޣK, J7&td)Kl>;Ev,Tf:TfyW5 ^50=N vKn|'G8_g$+!ͻ?rP-tB:uB̥c#œ Yܾ|i'Ts=$S{)Q@>@BFYt>대LRL> dߠv_Azx<ݹ4󩾯lHL 1 /eʣSvP]*vV uZVDƒUKNȀ ̉leN$3f_»bLu>֒EK2鄔tPŜQYꘀWA\F+kE0IeԪD[k&Rӓ݊1z}bj'U֍Νb ΋vvCoЛײs K%+r&G[#b04F}(<Eƺkt0&ƻdo_q4R߆x/a%q*~^poසHMߟZ*޿|3 r-1b6EC ѳt|쒅FNK588 3vAee ]"XA)5ש$@mnʑmŹ#],;_}G#ï}zn!48.Gjwmv8*ЋO# R<ŽU-yŽUZvc҈7Xj}e1, tq .dg(Rh tbo/(Cǯ$BAڞ zCC 7tD@L'a(<r6 鞌Jk v+ n2rNvL0w|Ȓ#I<߯ݒe[eeKNcEcUHh5RT?j cZrQh:3\B4D0!HL:z4h~wDAq v#CI9~{~kN {ә`e3fo uZ<ѻәJӈU\!n"WHT|_ UR:\! \Sk+U6U<\t~7fE /b:[qѳX(0/f}kqDT-#p pRZSqVlܖGVcw71B[g5I*3 fǖ@]Z%Rj"כ +'wR]az]anhxJ.֝z T#?)Bru^r*SURCBєIAs2Z ;WE8h0$]!B轁LWH-]L%k@P,r/p2Bp•`h:W_t*'̗[^^:IbQBo|܄T9#\Qɡ760،UQC0Π҅{NּJ('y \-.C>Sg{qa#>Ak}wxVJ uAtFiPBjO9ׇMJ)h6CDMצkS)xm 6^B!~o/#S&,ڔ)U:2ZȮ2J#IkM8d{hЋCLTTThaI6`)B*SlsSQvgU.c} f_l{ vs&2` ܙ!N UăCMhSh34S.=0Ӄ.ޓnlyc[ߪY,X@f_ܸ+gaUEW[4S>Fv=vnx~^9w7ߐƃ˾W!uʫfZ%T˳sZu/G~e|L~T?vmA_^r Y,hǚex{IX-|ߝ/1㾻PytE\k҉"7])r_tX@/ a>cp}.4s32= ܰdyv{;X{kxWeP-6<9.U܌Q'{|)[n k/M{q*dHĵ\H8Iy͘44FFiRR!yLRD5RBr#uzuΐc?X$IR)؄ 426F؜6b!k Y[)Eo,/nn6"t?;(l7aj uxA9@d\E.X%IF=wY'aʞ 12lr'!(fJۡ^F&J7؍s?bA?̨ڍqǺFmޢv+.x&S)1p u4I6Ww6wh O0#gEi;\Y)g_eJ|=Z5}~q7U 1*gr3 %nOrj(kxG>FG !*8UsK=K@ĝĭTȅ:{$ .G9DVy]2xddJ!xP)*)MBχ(D-v J. P O=t= {uZI-^Xo,֕TsS 7Rde#dB2W[ 234KqX[* 4T$C4j^Kb!98!yO$ SFUYoANv!5Qs 9rIr!ThdtNjGa>"LDe6M1Zn/TG沉/[DN3Z&i)#"IT;Cy1.?5:5LQI޻; |R&k3!ը daP rxp)v4wu8#ƞ){q-A5lBytO`L W8 ]Rf'bIE86Pr*W [I=J8/'+쥂c9F~uMpEL K+rZ_}Tg3#Y~-ogĀWI̹^[nvPr\Zl: ,hIPk^K狚aX,|bPb`ż@w:VYꤓZ]WInud_òj+uE=Auq𣔲%T߽8Cv?8ysëɫ7N('u|+u#0.."* mD7pzӂ647ikXid_]v.FgTh~ _.ݑXWw':]wЕ:a Ju|ޡJk9ŸB4 eQ1H7ӝ,,`Qלè@-/yoM JrVy[ oǶ8L0(ϟhueUh4Ш'dSMpˣr&CA1MB{(//v% 1X3 73{x[Lsևi|P+`rk{Bמq:&\N k¡; $Rp]ؐ\&)T d[{_~C$AZ5  47  F8PPNtO3{3Ӭ wF{_fJZO$i1e:) Ю_q:NMH[ʊPҺ pQ|J6lNCظg׶•ap(F@r`)4ȹ%o sAy9OLm@!j3l-ᚳ˃Iď : _ϛ+Dxd4eYhJGO$U !Ab<5!$+NBakP|2 l[환 ԓF贑w`egV[i+R=H%$#|*RWp<0:UUE?<K&F ϗU˃x}L8]K< 6#o\~u.q3r=f ]j'fT#i()#eܢ_IigD^n/5gAwZ{ňMLu ^٫1hd[iD+:)9l01RQCMg41 BIXJyprSڥ:kƻOG{!?^'-ڸ́/f`.z}t_e R-SG `"H#ZjЌ$- Vn"d^WҒydo!B[}e; 4g<_F(r|bgj_L-teA_IsHb޵5qcҥMvUflk*^xo2cdHʶfj(6"A- #S}&E#8WV(Ade ٓcxF2=dy^hd.( 1#)Qa4)P@ byRnu'p\<굞ˬw,F;433a">S8hFey9cٴ(nF+eZtl#GmD=*(ʝ va#/0IFt>Kބ@"6vQi(%qfu@+B(2B g-N0JI>@VO=-!O`)qmOxjJ8eUt~5ݎ{BÐK thz _$W.LK}n^Df] 8'4duHLAuHߩ@a1x?5,eO\?|g|a5:|QFUZ7C=4NWg FoiChwXKn]Ų4ﯪrijAQ*fV0sD)hv`(RzͥS޺xGgh}5nvka˫k4Xn# N*͞%8sW%dhx2j[?K\ fn,08S&E aR]q˞D91T5Dk [PTcAohG^!^c^cucn= uG?\뼓2$o&$c؋9ѓwY$5m70f !.{ 5{}ͽCtKn'OM@~ix`pPE`Pjxl ~i-&jZuH݆=fGNDD-[ ڌ-.6;["6p˸4@74XD!p$y!l Fˆ 9FtzdG?;C3\߀92jXB`I(QȨZ)ttbdFr"x:v% te ^>.x LqˀY,C2`AFZ 7ִS2㺼RчǼH=Ey}e B,KнO cՀqK*;i_+mPrۚV{h7fS:L'ҳvo=V0#*)I- DL&zGe~hGZq M`[L_YWzgq``qg<D)+TS.)Ubi#)n[aiĩ}la#j}V s<~뵇ٌ&~u<Jgf%_L{Y'*i.6=N\uLkkyIz?Nfoڦu}IzT %t_߆fI;q]"uRs02˵&stu#gM:>Tɷq$8զNcһD53*unhmE.7đ%ʄ ѡP5nhwwC][G3H5f^n|>nzAb1-/weTd{Kv5Lol:N8,>BDFp T*.Cb:DP 6om$\.%6!`BLª1jvdQRM0B~Pso]FN Ch*?XK9r誆C i%\)M ӭ^M|IBNWFVbGdTaXK"2]T>BH%1QMKc5ߪ@EzU F91Ub1.7GtJD2"2aaBtt,&l]{ 9E@NuE[R͠M繉,BgoY!O.q\6þSFxu)4%[i &ROXSw6>͝8v=ĝh PiS0>hCJMPK.#&yM`QbFx )/YFt4D{S(s! 
ZDl ha[aFhNPZb歑ll/|Lhqv3nahsI{ѧZRH“|^ox0~SESL*XqU5xZbgOXBXGJ&$-AbSNart c>#&C B78[<]ZN350%(k{0g&'Φ 4An0VMf]H9JkH8cI|Xi3rE A< <|_ ³e|ַ1z>(.r5U1"yX|#"hH8TrM$]f,.ɖQ7uRArd̔F%$d,UjsLz#2e1?1ȘPF qr:xSOD*MA贑IH=?.C0HRVaX1c2b03ش^KN(wNo'a?r ~B{~.U 8mdu feQ,RR1h" L|?tN^5+|PRvS>xNCuu7xךsjkb³y[LtIOm,U~]+/'*ЍvvڧEɑRn'im|/a⇷o^*Zza؜ v2'*}aM~g}%tuge/f8Ɔ90?axLLhP[kLh[R;h~fya ybqRWu?}7"?gIzsHm.@^xv̓cK8'=mŴ>bX|&`RPi,*-`5Cg:p>"`)}Zbj!HE4hb '+᩵%D#8 :m#bEAki(xDe 1q!ZI"aY1Ȱ{@!Jj`閡p   Znv}LOOyo?V4;w]1;xVH" Ōa!@jĤ ;Qop10i6,RDӇ%#C/gy3L^Ey9qTwm23XGELvg]`wOYr$yg[?er+= ,CQLʸh3LĬЈcbII.9ˁ3u!NG^6+:SuOtl|s̃7i@0`A-)B`&znX26g#d0.`]V<`lQSSWmJ\y Ҵ,*^>< H- umfKYLJɸq}4j?PlCl1xE2}:0߲fv|RGna<Vfh(l  O8M:J4IH傴.E \g@'Ta Yx_FFȢc<י8^v/ҝoƤ'+QB9>pUg% %` vn+ycGn;>q0P> R.t9m$o[m٧Pp Wp}9CH?'oL9e.1ѯOmNnQS4}4W쐭ڑ0HP/$_zvyj 8m2|s(uH2B&uYHi* L !ܰr&^氺 ,cD9MDC2z`dgQK'_2Xk4h)0XaEk/A;myߦw( &'J[)Y/6`Ў |h`9 g] ¸$q6YY([[~]jngjm=9xV=1x<-xIg%E6V[ )W_N>sdO精 ;-\ Qp-D{x&)Z:VDH ȧƐ֕w_G_>V $0mHN6^ hOɪXcJ+?g粼`QX8Oq,j#Jy d3#$t0?{51"%0#,Q  ]:\pRXL t^TVb=r7]zd4&>*0ep"@o#D63a,˸VZ2ǚK&|S^@y g6yu@%b͕ے el _A=3.[lI\Uiv"HW͍T݆@Bx8h`Z ABLSd8m8a.JSkB8+}~p4Uo9k)=emôШnQؠ&p|D\{ws>s/GGqШ9~(DyxOwW30·7^^m ?74o |37ֻ!5 o)Lz7:s'vfpp4&?cd̅r ʦHB'PH>OgGt*其[壭k 6,v=:JLWҿ2#PRX'vQp4my?'7N=%;}&Op钱]k֕ N;JVFy%:ָ͍8]!>CFOpG !knH4V ׸ $b@Up* ޺"ܢ"܊=ǔAsl"hezg)A;)Xb؀Y2;]Ŕ\JZea۔.}KmY39rIJPvۙ8/涫xwm*3nɆ/ќ}|>AaQq #SVd eΔʤeɳW\Eթ&`hsMmH*I?+^~ɉCRrB*Wpбv& ^`xZJηګ3=YKu:.ah+)뻛̂tr&Fm9O*ٖxKܼ\X2:G,<1)S`Жd:i'Ԛ#6(YTvc ژH&`ş1[/(&gp$~-Ĕ䤺sgt Ҍu}a4JT_Q_mrZ͈ӋDqzCYh& t[ Bg.DrF&n"ӁW|p9MC:fmuetvwgpm܅Gk/5'& Z /TYiJt䑁t٧HeeLj3 =ݓUۿԴ!זl(=ͬ' b% #h & 2xtՐ5ud$&&'eSL=$*Y\t3A- {DR+7zC?cq)(iOݦOqj(>P4WR|%#Rhg-Ziȓͳucąݫ=s322_lxmIzׂɒ)冂b3Noc 8a׍wrǻoٵwEBxY9X #pRWtJh7Ӏ67;8/ZS0?,F硋)\oLt;q#fk=ѫW{㓽j$Z̋ro&ϿkJ"ob>\qX FP+}ݛ^Ҍkʏק\g7/9vtp8[i/RTm9t{`lIK-e͈,m沲#Ci0l}`2@/<:>9ĥ vwN.kuY_5gqY`Hs) +R Gh0 G'o~m?~osa_Ӭ $]6= 5 )?~9z8Ru>.ڮEYy!̏ G^=/7Bu#q2]X, z5Ƹ3F&\̷ (EG8HfLCʡvct[u8}^Ӛϗm:l9AY1Y<]N2A $}Dt2:FQN' ;49=ٖtb͂13]mo#9r+|KG,sEl"6,|㑴=s =,{Rܲ[6;㱚*VSŧF8s6#0Lm$6Ю׍N$51eEpf a@AP D)O%Ϥh #6(NeZx]ІHBbd=yg  e0¡ZDC?zc^I;qt ֻ3"؍?В=UikmnrwX #ň'CGaf/88 ,!$G{^]WC$@+D#8#2JkM>mJsW{˳~@pOԠg|p{ySss}yz9\|x^pP˄0o55`B%pqs \ є1qW,3GyH_cDqKm8O(+s?W׎HbJjB( \ ]XkL(ESĩHq cxPLO)YQ['}JTFilJ[DLD4QqndnU ᘾ8UO.xn z#B9[)gL$xApʪlՏ}4^Eǚ? R@ОVp_ɁF2#57f򫘿Sӈ2z\6D?|{jo' ?Nji]Pԣe(/_3]I#h\yh q^BiՄџU鲏tCzI׋|XffבQK}7<>}P>D>8|ÀӃ.t.ߙRȝNj/w 2b+Q51|!2ݢe|(cjwIlJ.[lg ob'6͖.>aN!suepF,T'5=Ocs(tsIwt=}lyfCZi'~j-,}OPQh6)um U 9Ŏ^jN5v5[🋟w]_޸~}aM1ʏ.fT2oPuԯ@CQS=`GGnM?y'lzXu7<Fߠ_n o;*bܽh=lqIw1T7uʷg`_X6;8uO| \>i*(7g:aw°MtQiQ1+{@F7ϝa;۬!?6ߎcKkɃJE0%6L$FAd †3aYz)$Ft7둥~ח9P9\o Y@JP,9HJSyHYcV$Ϲ>%nHŒv:3ЧjTG@@:hX\8z[u$Lv5Bx! 
8碼hT+;ZW+8qD KIřR&)5Vh0NAQ`q8#G$tG2<-B4$zpDqP:)!ѩwIK%2-7ΕC!8Oi߬WJ?cD܅Vc{'M$e 2+ D%%F)`nQ+Xo(>}d[wo) \ˍٜv|^5A:'4V)*Zqr* '8Dawݞ) 7!]!=Qt^u4r5s QHbigg˄?##rqg;lmO ')76H>2;H 1jMxNG4:4hac00JL(uNsC$$:tGTPF4q 9> >-e$y&4J l@) +|j9m+vt@EG*#tsVdh 6ʺAGиM3eT"tQE[EkcE~{,@Jʑ=|p kb^Ngm0nVL%G[Z12Z061%cD[ȸ#܌pˋX%XI\e3"c6j!P (Mx1)[D$44b&^ Dp7 e"X\[nD% L>$])rtܼ)޹qv/G!3NS>{/Rdp0>=S UE '/qjS^m=/~R#$+.QRQ},c&XBθր?L4 B>UpQ\˂1g| `QR_up6Мt)5<E}e[ìOwM|Y׉ 1$%3kꔣF'L)F"pJH}2+aY C7i o5a?)ZPR%C/1$O,Csx?^:^-שٮR~_}p}_Q<˗,fYt˃bfRQ__jƆTA»5Yιw~Ol?$LWORXR}r\SWYŧ+3O?Ӵ'Mi'i&q]ԚIPEB:[wݟZ ?KysD袭m#^{lMrmF oObpkZ0ikoWL6W7̓\,|4m8$ٛ`ۇ [ fUqg˫I8fя\e5;sŕ!2Z Q4W|SOL*A2s.*kFos͕\98g0{j0ڎD- bޠRj;o܏ S#wbuuٺъ @ 7Oip4Uo 8=Uh-߷@4*̤<35gcQ[7YKVbvoL{_9um )>o/reΑI=zM0k>}%+ NZ=bI0Sc sLa)|90ǔRc Ea)190ǔVc sȗ֌`o9-1ma)1903Rg0[c -190Sc sLa)1ų<ڳJ>gysܬCx:6+ A2FNs<s|;"'Bӂ kt(dt!o\` zf#6X|\^o2,?܉W3&:7MOc>l/x ZHGg<gpKE,(%ZjP)sZ$- VE%-$!vP 7hB[}ekm+GEbRh 3b>l7T9V$d'^Ųǒe: sDYbRdQPAheMTlwI$UNUc`h&RI܍J?.l\kswa߱_}yqMa/ Rt&K24I8)1,d)iRErH}zڐD>d13!E"@t yiLtVDQy O1Oޟtp=3rxW6A*H AUa 2ϼ(2X}3?l'cޤ=6{(hA*'ԃHQkn H<\F|W^&:Lk'o֊̌|0Y .'T6"B] (xC P*gi攌C3a۱Ň,nw{E*^*\ط7. ե4̫A r쓧;` ,:9`qFvd6Yqr|yQ!gp<_WQlpF1_ktٍZcW?j@bNWJʟF?\/.j_?8x+?7vȻ O>tlY)B~7/눱]#,cnݑCZr^:YEzp@GR:z`o!CGawtZmk0徆6Q!º B?^[Q=žk]UFhUh/G՟WQЌBo>T qг[_Z9}65{ۥw{ݐ.n?DyY9vT&] ^OXڝYu=f|G]1l>\A'F# Oix㔫e}'j6\>L=d^Cܝ2yw3O7\h}>93@ ? 1~|F\D}]#?4`i^b[hD=,F'CR4qHBQ5\a1LK]&jR5Ŋ"#fGtyb~6h И&g:uJ)iDOPMPw(DQ.I! U t>^#ѯk3ݸ*\}ywM?\Ak(ېj2y V+CعUfa= iߜ^mcU'$yNIE'v (ؒ-RIjEA3! VJRs +5Z^~ -75o8w 3*>LTPƫ䢏Iji8fUr8=B2q>ąZZ£?Qd_>hΎڳZqiN!* $CHT(:%[wѨt)gٗRܔ \̇$bag5[r !H&|D[l&~Y1v#<ǝL_&ĒāUA'6aUvZ*jl.?{0Kt Iq!La*)U%Mtur$\-Ddu1" )ef{IY`ggYS!H)T _$7MQ9 !YQzP*(-Od<61vL߾3ݣ5CL TLOD(E $!Nۇ)C%fL |Z 3dL L1$Q|@G)[1~.E#Ŵ#ZͨVOz^vᩎ&׊2TI." K &1DLUbҀ.18N*,Njx 2Mk'(DiM`wB;91H"ڭlW `~˓ o"jl|Cj=x(0 [RȀ +k6ݼbѢ̣ylضJILM4ԬZZ։5ireݲ=aO=':"QCQ6 C3Xm}'x9_M..kr<.tf7C)D1D)&5?utp4@cl1< xI0g1NsUR6eTd%lm@Am1mam c612&9r!HK@d%dVϤRr5Y'Y4sn)?5}{| s<^(5 <%8'= 癨O}S2[u/:5ffP.I0V! 
7F˱ 2`=5MNbcbП;(Yڳ@ L>AF#3s YrT SPdIx"eQXsX`=ЧD`ݓq9&{4:n8`Mm 񽔓պcwb> "(ߜb~?XV'NLӫ #Ϳ X%@.b, c@`TG$(s X牟d3X;!NuRaD BL)C|GI @Q|Ȏ0)tR^e@, N{ܗ!E(%8RL5T1l&΁GS@?m$ EH#$$.Y b-HZJ& B.(l-B5nd`TYf$6{Ԓ9ƆL>JRi2'LO^z|%.~o0_ܛkKܨvO_)(MQy8\ȶJZ#앲Aj E5w,!X!ʆ9"J7'_CfqBR%b]Z,VZ R8Wi4X,lwl- Pg[2+]||9]qN矦WDckm,I@655[v`3U+0S1@*u&a^, ))*lj벅d8#/&!XH")kJ泲b j7ӎcQ[7Fm=`xeHe)Ȃ.gȶU?G~Չqq)cyRZiɱhpq{8D6XJ) *2z `crKFqm׸x*xL;Cqx6t7G_pUEx@\яLF?֯0>?VP5)iR~~`{ ߽Z宯|-x(&~;g{]2Z5'O+e58)KN2+4KV{%̌՝ut8x{f_ @௕ ]fajWk[1♪Æ5]?כGݷHhwn:ؼ ;Wroџ~?ZUZDz8aN^XGWGhM5nz{૷L*X{3l n`5R(mcEca-{oTܐ*DBoog_"NDDk̸&ۑ}\]:0ZÛmJ[~S6o`눷T*_~U-֗^x̪Ⱥ*6wuv[-fmZq >% VˆBA Z](LJbNrtŀζE_ZH5&RL^x+.09w>pIst0S\_(ὼZ}hq}M&|M@g8_E<; =& W2epW萾_$ޣS-zב 1d.ϥcHf=tiV^D''ӳC9̼O_)r^MM\SqS3~4J]vMqtM/gمu:a 0gKUrۚDmŴɯlYΚ*хt= Y`Ua]IgE~J˥BI!l^['utٚhDu82PIo8Q k D{g)p8U Jn9DCÏ BqRQ$+ai"w<(g*43$tuv^Qw&$]>Fc fU.1݇!tq/"Tv+Շ8Ϳv %M$pVօú6ۈE>ƺv p~IߺKr;C@i7ROD%6hx|amPɨ`eA5J0Ɓǫ K6&+cuuup_l͏."fE*>a6p 3B˲߆N.Y{u_5ؼ~>WX4[o͢㏾ o_o kո}#cm&}},=YZR !Һ4(7^?c>cvn2g.W0 E+CglmclJ~}[H_6Q JOi5 )hbtRUQyS!NRTĕAH&0Km3u3%sqSP)b%'N{38vH'G 1᧛mϏpiXZ r%Xqٵz}S>A*ltڤ EBadp*Q;gAHck#w:;X#)f'&\GVhΣ*_;wӤ6]TBKkG2DpV "D&J<a U[k4O%Ɨ|]ynnw4lPחgM0~~~S׳nvk𝩷'SrmԛG_KSR~]h'#HUn1DE4./Qs:eח-)AB@X \Ձ L´LT$ϕ,y+ R)DNa~p?lNH/tl8`0q||^}`>l>-Į[99vѾ# R7C~Ο e=8 ϫ ΟWwzJ41j-{R*@ȧ.t6h F_'Y.8 R=˄ݚqm~ ½;7]6~>Z߽b*VYRYT -Cggay5`UUwX^|U|*nݕQDv5QaO铬Z'R3YuOXZ;%*Z$^\ ay;>#oUk;FE?ͮ5oKiE pݸr_KUQ)-;вuZ.˧>i|/J~plPSy:AO?iNf0[`׌M~e/XMzjvY׮ڮ4mhGgl--RY،)Ko{;|˟uv#[zЍFv[ ZC V޳63uvl5uN7[ ]4:Gj0TZަC!5ZѪƇ Fεp0K WEt׭++[tҎ ]5^#F?o#;,f36C+DWM+]鎮:"B´Tm+DY QcZ"BҶm+Dkl Pqvtu8t9mb*BWv_[6C)yGWHWTZDW0B]!ZśNWҰ$W6]\ ъl RlbXWVƿD467nK'5F{B I.-][B~ fpn MZDiQN:hphmY +B*0Ub{Bttute%MoD0ek ě!ZxB]#+Л+MKbh#6BoJIFWf2]u4AO5ۖڽ\FBWv_\nRU-2˥\zf͓_ʑKo&)6eepJɕ\zHDCW].݋US6M3ykL5+Y[L5DDVuj6lu{VضWPtBC+,i]`Jdk 2Bw!JE;:@`k[CWPB4HWJ U +ZCWW5#oOt(-JsIY ֣TPp誝LY ^kzޣa['ڬ]4M&WhL.Hl$ HN0 鲑5<FxI|m A=/nmq- Z˛ZεpiՆ li{!\ޚDQ=!ҕ5\p"Jղ-thm_JLGWCo{(j!Ύ6~3zOQ|W+]َ:ԂfҴ.g-th%i:]!J::@b`[ "Etp k ]ZAOWˎ03-+o ]!\BWVPtB$arowph ]!ZNW+X)`ؿ:FPm8bBӏj]5}VGSy ePkHuKgO|}"D>+='>M `d$*|QH_\C]_5qvt=zBhۣ/> i|&8?:_)tRRYE{zr̯UR.xf\sr< u u ,VF*BI Lo}"ZJRxcPj Lswzvy[!׉3)4%'`!\! Ro0(uL@!;3Gq, Ȱ ցD59LIӺQȉ|z38kYW>5+u6AK@!81) RbՎ hQFY2Eᅧ ny;UOEgDOY)Ȱ@ EA# u*PG oau(03SOMޭkUޏd3ԲUGQ(z>4[3+?/=wzWt.[#X(K,o *dK3': qL|K(_SLo2tN&kUdWkϳAɺH7m3MsMuTiU$0 .v6:Ȅ {]t Jmwv1ϧzۙU؎͔o 7ˆP!JXEFP<1-7$a絯]c0r?:ژiW48(''VExsAH^>YX秠*ױ5QuwmfmjeҿT /#@g ^A nP4e#/Na"n0ϱ8>f%p2Ppڟ~-~XʼwXg@ALGO0ٻ6r$W{۶( ;X`X`w,{$َbKe[m+2I["Xb됞%bghK GՏҗgq6!x6hcEL3?ޕF ~61dGu`KWD<g>Ȇ NI0dt#4Yha~YD7~)_' 'kR>;PdȍGC)6m0`}ͦjl1MC Byv֍B7X`Yᙰtl w,l`K$Q' ^xN*E6@*XV*luT,rBژE #dQUiTs*>|R1ZsLޡe4Ku;cIIVZOJwA6Koћ6#~'zcعr٦&ux{c;G؞-\IZv-[;wȥb|ie^K\h`"֖$wKt΄LZI笑9u.\~'K.f{W Ҳ?vbZセӌH'gmKg4]\??B6ϦK؟xx/UOor:4AjLJ&Z!JQ{Q*=rC'(.fZFmMN)[[h 2=mʤH<.!2R-c5qJ=[XK3P;7. 2_l݀O1h=?]WnсcJc *.e`D9CX,Q\bK+30lJm!g1hl$ZB9 [Y]`]Ok dᐓdVMJU!2@,:J4$MB@a&H6>!eX$R>J8OÖ/cΊaXM?P"Bo{с! )g1{Ezl9p}06aU"2YmZDg9cqcT 9M>e2 r3!hۤn$I|b,n-b5q6\>AzՁYL:`lkUezdzUP K -R!2Z3(tND.򩷋biǶPWz;{vQSz:vܨ*o!#ꉃ{amAp} E?*U,*)6ٸQcOvӔ54`nU1J[*$ӟ軓GU|ǻmro*s$+ Ld %!3N mb:2Kq KaSP¿wT^ Qˌz AB& 9Lى T9i ; 9`"ru6䕫k}d27&L1@S0MY?ke9f~޳C&HMH_ P:nˬ&YM[ɷ[t%iKYfwv S6сr<pKd8e88gdžFxS< 錌\"!<ҬѨy,Df,W nM4 I-Ϲ~w3:9 xz?ܷd1J4-x^kg燫5z-i5p6č&_5% ob_ZqR'oT9` iq`~{q[7+?\N]g s?y4<>tc !Ut1k=oOfpG8hIq-v5#lfUY~p|(2e%?=msx:=9c:[Vj3u8q9GJÊOgX}1%|ly`衜GI'~?l8_ ~/x?y?7? 
}ˏoh c9ƯK!-@g}M+^ijoٴIӶfArG`o%xHw.>XaGl]5 *zi>p:GRTFxZb Cl~(Pyc<ʑI|oU*xRd вUb6E%V M6KzZuNZǞ'}i5N=?+CDgy F8(!_& 1q)tJ6aYuLqvq/ytb< Y .ɷ?oݝOCO!t YН q@Fhge6i6a[Iq|LNԈAq5RЀwq>20w[F~z{i-AH1{ei[ %l)BV'7|/QF}6F_GUXYhI?\l{2H0'A$0Xj˓JR_Hח/RZח|wn'p=Alh.6Jr rhBL& qƲFrѓþW\9&IzpprVCR;\#rt]RⲪ}g< uibշkzS6u~8S|I:WI"wv,sq 9B ^msPJlr>\2hh@рdqFAW0JEa}cHOh )!7Z]k ?cW6]  2ޛh<-29cfp Z_v")@{N|aq>vQڧ[/h:;,n>ϖ=raX]W||y_{k9ӎOO\ =JQBra#׆yYm ,y[J3`{/Xx+[-"7bL3.ʼnRD́Ir< zL1%0PO%pܥRDB+/hn'`(*jlR-z @(\q+'=G[v?njhN/;3lmnF 4޶j+U9u$s}t*K$/N忧1RV|9Ҹ.O4(g/ 3P1$=Hi]n^0$b8I&uxt~{==4,sؐ 8Ifr*)rFtZȔ Ȓ ѐ2tB$=<0*{gFz<0s h+cHC'X& @(t1Zߚw2ogVֻb?  Qmnw H ?BȖR ^imtM‡4M)i2F9,YXFdyPNA)qښx{8/\N6a/e9ٍFrXKEj. " 4EƔTS:e>^l. e?ٟy:8`j rHɵ.(UzH{˚ Ս&ocɬ%K퀇Zjk{#,+a86xe(i\unν6MI"qɍ6Bx-Ynd"5}r{(<=C:[f[dSvxֆ*a|UpNrj@Y?=LM EVzQMp^~h1Jq*$JYX>bDY=U|/de*TZTjY餁p_JXc6@l11`A9UVʵpe)!#X/:p:[Hh ޚdQfg[>̯R7`j"YH,gE+M,,6 D3M Ln<d& u@G 8=ݙMN\%J90v,|مzr=l?8;3 rh 6EmwRĠ?!t{beҺZn1~ 77ywzۆoq}Z6;;uXĶe܋q:Or~DMdޜotqyGljx= \=RJ+`բmWL=p H`Us"-BH)yW8ך3"gW$ap.pUpUԢWWB0%lχ]ԹUVȮUcPe 8*s.pE]"=\BBH` c qEZiWEJ{zpV\(`W1HS{2"%pJڿ,~0JHntc:q|nZYcxᬉ#x7|`:/FMNY~,Co}s#H8']Xٳ"9VI*"%ɤ\{~3j\ffn*9m|Y7j#\1p$d9Ⱥ`gMF7-Gb*#=b(mu?7G=Zdi,x2bW)|\R*T%/|z=G= \%m_7CRVрT?EFj`ax}MjC/ ;ozUyFF:`00[ymidZZoO3գv1KGSկ#鞘Z+6J+tVU\.Q3k׬~fO2v$ۨ% qPXY- p'Lcz=jtTL)dU#u2,OUn+A-F R>2*uwE9{OC"D ĵ}A܇3H.u@ 9qYP" UF'JNIևã " !\W& O[,'K'@5qvgF)ڥ hJ4_$pz:T[mgjz|)> a KI>ϸ'gi"Zt3 (g%dbiڽ< "rCfeH,Kh7Ifpw$e'4\Y,I4+U-7Ik˄-:KQ'-Ykg^Ԭ^m .w.pJ+NkK*yE0v!CNI&< ){aEyaK@F!踱IgC8mhvxgi{ Qoϫ6o=wJ- 1ȣ.LaJiFXPi6$?@pHب@<48e[KY1 iήv`֚'{Bez'n_ )n6-4ցg"WEk@eaw^@e=ql!1:qY+ELٷV)b#dM>D}z*O. )ƈ/VrEWN 2YB_T Yc=miC֡҆ #ڗC L&Dx:8PE!mXY8ɋĔ !,H]A.@v-E+knH}hC~ʞ#'&bw,(8ZFWvuf֗Yy;xwlיOmBNbe|!)\nR be#VJ 5Qь(iLSz62]@U=B1Cb;Mw̖Mps 9sZf@cT3 "ha)r=_Ԥ,MD"64B{KKZ{<U{Ϸ9[6^hC `lxkspE!iTd|9(ĝx9x1CXT.ƬqAh)x45.ο{S8gyD kI$e*Q *;Owj=2H2 Ke(bS"@)3 NX6O5D-Hz鍎P+ހa s(K *?Ҵ X eYgR؜2Cm#[/cHA%SKDF'-2A#(&H,`)ds&\ZoSܭ2+s`a\4 2]Cx/^abV|] xNq^|0I+|?8EEƗ!)q?=t5ϴ>YKɓL}M(.4g\#(N&86.ѩ2/bhe }:ǮƤrx ^znʧM@Oszt/_;巿??X{Op'Ҫ &Z֑ IIH ~~hk M͇Nm3Գ b\ck>q`o5hbmmH_>47vףlܮdUz0Mb~+fhZC% wOxF&N{[xߪ`{5>l% (06H)x`*1\+h6I;z.|N,;ѹĂT(y!Di  81uTby5JwyIpU]˄droqۖr6廠'-;m-vG^,(w5hW;JFZ|z?@{\\/\ۿݢk@51J"8<2xSރa$$3u䖀R.j_ircpiWE.Hë>:>bd@3&t]Y iElQw4ɭ_׽`x=Xh (y `I=;кg{G'WFz?QH[%6}oitŶCZ/Dcd!WRP{ Vw-%N ;kʣͧџ9Y+I0q̸FaM"43a':))kYi7|cg 4 ޥBIOĤ!':SO6ti:_&Cz_u5l}4j0IMQ[Ì' 0zP$9&-wVf趾l}=丯`k ਜ਼;Ol4ףאY7wnlږ+|^9..tx 0ߤƍIOUp}0k) SQĤŴTDQ 21Ϝ'OΡJtE 3'C'%-{dYbg{D4z)I*Edm4yD ^GS:VV#g˴GU!Uɴw AOz[>%Ńp!gЂ 1k%$A\2gZgbP2јه,y* : **V3x! >NϚH<*Z: c.Ujlѓtֱ'X(n3U5>1EWmٵ3aM&yTx/I粶W7zN笽f䡪VޢC}% G&%fs7h`niI`%t2[U3g3@qHL.9m!( U JPz!LJfFfXӅ8cW]XK.ܫ.\PTmrU 9|3NxO/hB?'_Ftd=BNJM6&D piQFF&g,&'UF$cY4c/)E MFE 6mGvqʣ&',YWC-r6k8y(Zw쪵ee-;!8^1gJGơF9IcUa iiσU>ؐ\LȐ+DKd G,H!k`jvjlևQ?9+P4b5U#QuӈW%˚,1a֘&&`v!Z`"V;ΘlcJr>%X%r3brm]JdI|d19[.!Hzq̄|ոdW+EN/>X2TI02LJ+.tXXICĥ:x(wMe}hvӇ{Paߌ'xѮ[9m.7vQA[uя/9W\dt ?7mi#0cT\6s6VGT]>_P(2ތbHhf0rQi?V0K"`Td.A N"wLb. /KaSP挗V9ݭZ vPP]RmƀL( ً(t*]"dFeJȸLD5x&y|`ߍbƒy v&˘0)fr](hm O'Fj4*kbB*J.:+F4 I0-p)T1'i O,{Ĭ"`>fۨE\49eg"Q~tr?O7K ;oUzR೰,9!Y^s"-BZ#CKF`cݺXn5Ñ2YxR5 T59%!Ac2_hyzW4ҁfB!Թ3xαN3m]L8Et–.!FrΜp6@Y!zQʝef|3H%c8a1m[s#*BvHά.MDǬddl$*emq_9#7dάC[13s;Pjduw`IjOǓqnRhFFroQbP< HٵWpv؞͝b[2Ppv%Õ>Ȥd2jP`Nf˕Ls` 9ܕjE<Co2Bj> JCL khɵskET/R]pt\\wTn~" GK.דp0v0me«J{|tFKլmĽ㳌koP( ϥ{qƢ5:mX:{0V0eͣO1 Vh YRF p/9Ȥ`Uaҩ_Iܯ J+ӌ>\s HЄټ+g9H咍FFdD J0]"LE2zsnG.c d%BNvjl8+p86-^|}&bmiQvEgYe²,Lb E寘#(o?bg)-|~isNG~ WB|t 'C(-ˮF4r|?mO˒# Օ-?,׶V{HVe"Gi(R~d^44@V29 4zpBoK!ۻoڸ hrj+C 74mz3n;ϽOմ|Hmg)I0 7lw=_5I;?j?ڸxbEx6'#<`scDW5> *8ݿ}曟/F>6();d;tLakøJzWukﺿz WBṯqk0_,5$.Cn_[WwꍗiX=Mh9D&wjz|`eZW{c^G׺z?qrnz +\ tݓw(4u%qi?:4~+-ľb6⯍5k=zȭZCzW9wRdil\j? 
);xA;I5˛hjH}rK,!GFΓY8mڢxdFB?,u& fE^pe Mw;>9>{#=,zJ#գxB LYmeE{Rkgga*˷&Þ^Bb5o*1)6ǔgEqb9) j .DM$#1%i$JܑKM\+ϡwL +) ^Oy5B {g;5܁w#ەV jcq*6uΗ"7BbI)oK d 홧*Ƿj)q2ssdHQH:SѩTT^'U$ǔT0(tRR[YmkQ֛Eή8Tg!P9E}Gva#>_īvsg_5،u* 2Gż[.*2V$1:LV{ߜƻnyuy}0~\.o9,LJqqM*ɏFr?;<,Og9"8,>^Lq_`?`JU#ZX;2;+o/&GH?Us}=(٠[G5kxC - Zgpf6KfN+פ[s==K^Oɳ㮎_ׂvus;Looב̎+g+nOmtR0qA*# []͙ǻ'u u,~yig'=.eB:cgU0ix9>uw_;&f!p=9pɒ &(KJ%l϶섐NWox?wt|zELdD R6c%%ϑ-͏loe![3,Yˏ0\4-4x^-ƛ4A}(V-Jxv3v5cW:` ^o{ 6+^ZEsAf(5ۮ h hAcg` BWfNWq{ztŕ]U ]Uzg誢[OW{zt%]bg|W誢v (=]1w/CtU/64]ִ+tFm+tJi*CtwNwl;]%HW]+]U ]UZm;]U7IWF'Er@Q+`h~Fshc>Y{00u?033N.MfsŹ I#nj)qkg.b`kv(:3^eE+ޫ(8$M[ùޥ=;VvwΛ6lW hz(R`CW("CtU+j]<*Jk.=?`ƐNW\2]mWjJoD7Biho@W|OW]zUk-v+tUѾiP7HW\s[M!PӿdܔL' !xYo0h2I^@iu=hj\F_Ƨ^Zqh0Oi^NчCJ[ZepRl$B2nxۜE*Fy:=nނP*.NgP2h4VZ}~y]:\W'ԏ3e¿KNͯykrɰJOo?sO_y?j%n= EKu\7pJ+nl)v5"|4XJqUPEU;rT k'~Ĵtv qRӼss?|\}"޶Meu"ۿ?hpsi^lɤH^QtpR 2)jU#y!񑵦>-x{s{Nȇj>4_Kj7 '8p׼x9*:9m X4,8Op< :(L Iq&))gKI6 \0)$`>1W\B1 ^ 2D}֎ldT+M{/Z,1n `地J"p$e&a$g ȗNmםh1V8'䦂\ߤԫ6?wR)JЦ(FČL ȤD'IXq: wX8iןFKI$edٺLdt < aNg}Sǐ)p3F!ksmbJ\@xt09PT`_g@03ZOXS+K1f"sfsa-#g\M'@x`hD9#IO?搙5GQ DMd<%]J: i,Yk>ovBU ˡH͓>X2LIڬgY$WEk <4}>0:RΎg!NEo=tf{5| %2߅!T8'XCQs4/%$E!T)s q]$QI% DDv1,o)b%gZqa5_I.z/!EL/, ufQ0!֑A}&Ť mɒ8ChLkDXvL'Xil aK\GGkAO-| ~Ui=tL^^<ȪA:x`,wx XE%DrH*I0ϊ$$waBKlEKIO +Q= gF49~^f6XXØt%a@ZѯG/Kv9@J7]6ǐ`?gxSN{,(! V,rLx*+tEK"g켵5n@xpA;ZJ^ ae8p.kM(41jIA0)ie#JȺ*tAk ,p48Q`%'8L6"Lt(e NN G㋅@& <-a0C4N#؄v1>y DU׶mW7FL:@S+Lwтf1:JBcAc˿Rٝz4VSӄ Z\yg1e]B=vY: kOfsepaP"Gxp*qFiJ?b6G>4 .nisBI%f. =Ҁ n(=fixb <"vdq PaD~VHalW8ۀ"9%MQq)r% EKk0.`2+$[}PI{Sj=%Rp81bV- DW+/s3^.9wO񯛏x+o 曽\}?!rsȶ7_ᒤ/?o눷3v kuʥKL}~|ww^涻wc^8] 0ŷ~}q{y}s{j׆x? L}l_?ƗGtY*cW+Ut>I;,]$&bŴ+L&(VOAމ Y Xmj~Uf*qs[\m΁jƔ+W6] aӟ;LV &9f1H4cU=FLos1p nSF18X>}x< WER\}9 Olpi|C`5N[8S:[26*pWmz듋QB$W,7)~%bW+ĕCb \,W,D)~Xx\J_W+l \\ r][̚ W.H /nKY Xm *+DW,W:CUڤZ!"HX.&;"9빶pj}jW2: qmx쮑Mc]sʥ6B<[ܔ}tC b0r4 ŰJLDzEm$Ts%6`ቅ>vcNiA\ZeVrjKUqiTU7{´6^eתamדz:jrbRl+X 3ʽzŚb'.(IڸaH:f{w)㊞tib6~=Rk'Uc1T+R\=mB2(WkM+Vbb>+V+k`JwY XkWujy\ʨZ#|` \RՔb+VS븂hjJ>  .$W+Vmb(V+*.$I'/'d)Kf|t*j!AsdW b֮XRZTfq~yccZ(5ʕ]7u{!jslzfً4:CR0j}8X% 1]/6;C96pO+V̺#*>1!*-P'BNm^|Xzn[[MX\ApQ X')bѶ+VٚHqu\߿Q(W,w:%+g1QqB\y띑+&^ XmlW2+ֈ`3֮X0ɉXnLRpq*Q\WdsY 8/W,׋mU&]j_#XA hf:XedpB$i9kWRL+Vu\J*qlܻ8xKW_ t0cDzcYl/Zn>KsYګx[ ފޖ/(KU?:#Te}mxj@ZEL%ݻD[VN)O-egLrmXncZA_Y`NqexKYҩ>=BQbɷ+VzĦO#>_*9/}Jn1 o3שu m3שlAUR\=)GA,WvTfZ!\p>Ab r%)by\PW+ĕd -bpr+Vlb9V\W!R \M]ܔ.uGJkWk!VIp1bK]j:{}Vq\E&A`k\\k.SWkU ݖђkȦp};G1:hxnLY!99;,׋eKSuaΔM+l1<ҚZ!JXAbA]>K|*U~bK8/~ެNnYxGFm4vrRu*CcW \esƒX}+\Amu\֬݊%c+NNtr>Q66]ʢqw  .r+bpj}lWj{(+Ή˥,WqW+e ]ܥKש%:Xe*f ) Y1bQLtjKKPMQ\W),+YչR Td$cs;ٛ.P! 
TuӰxTN逪ˍ+UU泌WdQ:tQ.eSM :viS|,V?* r1[A&urVKaIWl$aIΡc[ jR0թ ^q<˥)14PJ0u BT \sޖ+\AV Xm *Q\W޸ \Aprpr.uNe*>!W7]zڼU*dp"S3pł\$&b+ FqF\E܊'+W,7')~t*S\WɆ \A0'W,׊Yjg.+Vt} ޕIYrKc9{Ё|)$Lg IYn*R0 UzbX#3&Kې&#W4*)ʙ53M\U f-urYJ~Reh|qu4u)"W1Fb9HF[tZ!"6]Ap6r+kI XU6T\WRn9Xn)RpJbdWk9[z>x=@7on8 n!;< n^ ܼ~ eYK_Yo~73[Ӏ7Sn>*V¿)?w>~WױF/sxwm>+Avk6vh[NUAwmw; W>kzwzFG.=wo&ķXCl@ӿoѱ7.m{Ӄv]o#W}-&e6_3 %Z~nԒR[nXX=fEփ*Ʌ}qg0a{S,^vGK_ VKԊ'*9pgk ՘+ y=\%j9?us[4W1L_ ¯\%r }-*QO>pTRyܿEs%\Ղ7g[]zeJvGt4({5f:+k1ӉZ} @>M3]usuo}K[ .ffF1kT aYΠF[pl_ &|j惬 oZPQeM&JKiQJ' ][hUNtVa?'iT"F}5OI-&?n-v v/ t:RA!_-) -iHa^|R}(͑=?ނE)b_ [[\L|za>Mbm4q5mwce9.E6ӖWM陬h:_FY{=ksQ_v=_XXTE>,j$Kqi/UiٖTQû4mQ僞?f͠Bbf{a5ߵ]5S05#fnkm݊17v.{?wv.tB߄]`#YFMxO$[" EUbYEa#zFETՅ_3Ror*MWe^)0AJ~ F{Eai]Tѕujf])Lᮬ6] f&JeI۴WXүVE1:XaŷDPDz{-]NvDL!+KMcmMCݴE*AnWd7B%V7 2"/Ku@P+M4BFvچt۝{˳ d " Sߏ]r̷֠p:ޥL*;rk9M_i!{]Oag3;`Zdr+)L |\&ᜅyR3g`mVXB:_"#zrnŷD98XQ众8 +P 0)F\5ҔmJr lO'ʂ2jdWNfW9v`qq!䫴Aw8Ĥref?* AwCw l[6c]'&'>2n姳ϗsţka74YWތ0I,>|0VR]ƬV]Ċf>Xֳ1.;DA"@ǣv7|xZ~q;Z3b[5>ԜTUN|?͵",e8]nw˰#,>)~lF%M2 o:3%x`2vLEUuf"/hWn*L&RKFP̿(O݆0wl^p/znLza޻zXt<v۞0>L2~2W >$>'$Ո]m#WBB|лsB.{?~:n2bxUH{VN5ᢢiinl* S< _ ,,0,f4͟p)06Fd}nZiz6۴f[s_WHj'Kmy<_]߶ 9ۂ vPHU Txg@cꨰ3Ë` cH*~]GyQ{Tm]S{>!Ջ@*J1Rl2Y<+O>x*sC0,5"r7f05&6,ӥ^hCJMPK.#&yဦ|0Z(1B /MRNq,z̝lʽ3øaFhNPZbǡiWjE֟ήލ}NB9﹓n(]x6uS.kA^Id1QND.ND| Jw2;WvyWqj^;_x݄@ BeLH)-y&S9ustO#&K B7_KI'˜Rwi汦ې.&X`T9F$㑅HGꥦ4.\D;;#nrxizG? oD37}$_ss{9m[ۿ)Aib L9iT+-q&X!rt4؄n4*k@QAsgB>Q1,%! j@Cٯ[qWxشƺ|jMG}+΁ IZ}tYn`\u9R4X *|*F=>{Gs|;OsBqPp19-L=q Jp4AXF&׎##QHYuR;cƌL"^ˈ`˭,:Ɲs,m)C}ӷ!n,734,ܾt^ 8wIZʋlj]u/ ńɁ.Jgep=.dFC{4 d%v>yUxe5J5WỸ.wS~ﲊ^0^fhLVM<윭,{i W湝}͵tOn5mß7" H"FFtՍ"S`aV(JQ2tcx&MO9[xl42Niq`qHDȘ0FM)a 1Ci@ly:< 7Jrq=0^뉰z"iD#Ns9<1ֈ+A3*c gLO|QBYסEζ](PԵg7 R|^ˆ'^zXX}퉷xӍ<§k]ŜKDW㼿6r5MJ竾~JG,ܙѨ y8u#C~4n>W체5㸞BqqJf!B&}B10,^_|(U;Ӌu/匦[GĐjRNh$CǓbHoFp>nr /ONUGlt7”H$f`~T&:R¸8we5?_²(SE,]lF MPğѐnPy ކ|j@RmImVmmm>N4U+WS EM̳릊}M m\qeH<ަ+Ҷ[[Nci mN:ָ$Fh) ErZͥ=$y|Ϧ,ߵIX1if_&?xBVy%^r߻vf<ܸ{&X782 W>I_/^#G|$LbMEaZikYw,RL;+=2-TU8q<Հɼ2HG0yd΁ne/Ce߀9x8S5,b!҈b(A}Lj$1ƃɉGԍK]ioI+,0%y0^wϺӍA!)qMjm˃Y^![Z !-i dk}cdKLsշ-=Js=LMKMV(}TfMp46Z9::hRT+MW&9 㗥]ݬ>>Ρ6pڅCp) Ġ2d) i8i5k(癌L !T^-uW_~eUU;ٵةw-LdV+1TBjj]\3hؒ 9xn]VR[}z?R}Pkb'#R f *QI}V.("R"Of0ǘWd-J.%Ö9!@p^ɁT`5/2id82]KWUZ|7]~w-*/oPrLJ$uLޑ љGmbfYˤg[S4wkCU^1GG?T&KQ$E,Wʊ U^غ=^!v #P0&2',ԴEgeuh%NVֳjR: *A-OKۨ@NU l_=OKx <󌨐my}"D|$ T?_8IT@(֡!jCzzh\8 JV5g}I;Xٹm^lDeS$1"ElYDN1rtɌ" 0br_@ʺqTAqd)Ƀqia$!"bwe/#Qm v\eW h5s•z9Gb DFgg{у"U]Zi ~ ҃brJO1m,TO4'$nJm vs?!6&?ci(;d=&Ǜ0 Ӳъ'gc3NpTu+'V=e=L.{qհ:\2_o5=xv}zQ:4e?%c8e88cdžVxS< @][KH oQƀ̨Ȉyd6=xtXmsZA<"E8u}~^}f;;Y~299`z2`Jw$$jGoNhC68G:4< a. ;RĄqQ>GBO/fwݜ1GQW7iM9SNW:RV$>Ob|J >/7p.1..I&`8 ~뻣ۣ?;7G޾]O31l) H ?Њ[ --l3yb\mjqo`O3hqv $ף`KmU.P֣못5*UzJh LD7ܘQ<"B,q݈@m;ml$vj'%O4JlڶJLC&ѻĂ4hmށw뜴au~Hd!ȼgف/& 1q)tJ6aYTyWu07Rc;_}.;Ъ>tyBwcFh鼗BG㑮ʴ84Q-e? [7#꺫wᾶPEנjȃҖX B"}Y=]T9&g `U:s띍1_IḢ}^y+)} ;˻^m(=~]Ő+-oo7Bh.;!lGKkUk}\Q*[K)Ү,D0h{C*ޡ\疾_G F9Q*hmN2xG`p NL(d1tSKK)q߀yE +yR؎hO:nf9^9/]v[={&:;֧6K:.@8Obb-ܵYN z#U&J]9GU^T)pC 30$cc#bg"}J%ɒ)0uDUz12$Ժ#HLD44K t!6-}*;g{O>'S/!. sMfeLQԱ̙RL4!;% "t4 *[I+^~KE5B*i-pTlզEOIWǞh)NJO ?9߫Td`qU9X| )']N׆l$k^ $^Qtp$+cs<]7-MuiMBh%:NIEXQe 3UQ[S!&0AЃFL.1ٜӔJAyH UM~dUaa- +wSGʽoxý8عOOl8ݼZ??7@yAcJLAT\2L'. 
rTd2予-@P=BSJR!hHFd2vdg1hl浦s?b(]21IǮ-+QgwI9c–,i.kΑvǓia8$4!@Ô@YicY 9%MB@a&∥a*HF5_t܏Q9+/XM>vED#bwW%m,1a3F똽"vr8kK}06Dd3錰UYΘ6W3*p \p&mD#YD_']%2"V'H:.Nۥr퓯슋2.{\; Qd(@I"%"ˉLF}NX~ !L)z\| \<&⡮z7<|#I!_vHRf rNv 8l twUKHBұW=$%"%#y [9yjndJQ?q$H1ӏ{:hj?Olyzߟ%z??OJi[~Fo9~­gPڋ䂦6>nFhrmK#y=a7{Y2SQfLXGND@o2mfܿbIA#j)K9jr*.Y{UdIy!z/0G(//fPwۯ:h:;d~ | \}W>p\}WFYgN)ݎaGu "-S#8-Yb=e$I[ɷ[toR0iKYsͮNPۭJ$ ]d_,r*!WT41G趆ӗ3tM[}kB?oj!d]f=DAI0LI̪;ELЛ{SZ3PӆXu;I6PL.`M͹2bkT#*X=~r=ʃ?ִ!AAgvAQ}0"X/Xe .(os :'C֐I2#1 Tj(I$BZCRJ3gLB=lkmi%auڕ2fīe?&PP1Umȃg!ԯG5U#'4= BcbOr2t~o0~ScU>\ xO*_f9im}&8JdeB8'W\|'Hׇ);C:;"#<֬Ө,EE&yZ-Q$V\9mLrtr\F3t)7eDƒx0m7>#7?DoM/_j4Z|r7zUS3I}{'{+Nk͍-pܙ Ssv·裏.€/oFt:ܮRUmiy~{G7{No@5#Yʑb0j0kYVwlHG U;zГ1.ΦMι*Iu\vI Fo\?W9#y(.g{~?TJZ։}hv ]:嗟_}{Kw?|˻_xku"(d$tx۟7ȮCk |7׺.f]n Y[[׃vߝMritnzԇ-& .~f3`M#ZvO*gnqDօxuO>'.x#Eœdo5 a"% s޶AaˤrE=RtcyOz5ǠCK#3D $yE,@BP351&K(:<יU_Jt[xr^">}HO&]Uѝݩ1\ T~1MdS%^Ja*2}/lR76 Hǜjw ,xG8˷S3͢дAuqP' .hS1&D, yS]Eiy|)FY@BD !,VUSN!5f[ &IǼ.&vݴ_vx"qz/mw:o11 b%Vޔ6$u*4 Ul*qz tmzRǓg}o `T:Vt'}IR2Gl$c;KEی@WA庍5Q"$i0kU"ȘUy<~>q|ٯXB4,yQ9k:${sgYOoghK(KV`-s%2+i3}(y09kx;19z{ ;Gٴ'4\~#gzAw |;67lsZ0yҼCM(LhN7{1vnHBi Lm#?WUX/>͔^B,9g G¢F6] cCwK&|_^Ay ?c͝MA lG< ׈Y7w%|=_el?@=A=]:vRbV&HɌٙy%F}TbR %!G%t!Ő;O9&)V]€#DUy0 y$KQE W$M Kc0ɺl0Aݣ\u{p`=;PX"&*Y։"1)TB,Z 4REөZ!$-[@ L6U 8Ms 3q7T:g1R-ɶ;m1rh>sP탶T\J 5JhkZDBdM שgĒo=)v=}a Ia$@dj(XIu;qVvBձ/T/ܩ/ө~"5Og0IO/ M'5:P)6A:M>gf _td[[3q68|J|I37>:;Ӓm//~l,"5E*D(ȔX%7z_!\G~q~/3ڎݔYYɔ.=r: nFlܕqCp=~hDE ףO_hOm֐cNI.(M74|lZj89d7HtBPhht(!/RR\J H ]+ڳ8w^8W|bVc~c[wirÖ{ϟ>'x㇋|Y7jvsѾБҿU`/m7RUmrֽfտU.Pv_ 24y.3 RS&8M^1 J|Ҳ*`"Cۡ@ eW(pªۮJZvZMm4?h͌&<LdBTI:ڦd@%`jIZ;~ŨGIgOJwl=*Uu[e?Rt` hʶw"DAe 1neQF" J66]H9/gQM$T`gl7o0yuy~l:9/-j E9D` UKL$`u5uPJr[D Rh+feJڈ6–b!BzSxTP?<ޣwBY7f8%ڕ7k/`;^yWb,AbP頓sJ]LG_6pK^23eUT2Lgdv+Q iy`c=1PWmm^zh`S8PX:U8l'g796pE+"bhrvMe,EVZ\j61>*Ji%e#Kjn|pq$}#5~0ֈjDxp2 )02`e<*eu P-e"ƲgL3(1Qtf:XAQGa@XDv&ΆQq_~~pP޹g̅?v-BVnZ&޳yW*R1CXH٥Z8]oGW}JvV.xgfɧM`)qLW!Mh"Kbw^;cF'L):=d%fY iNo4Jc,EOxIg:EKbLxK ɓK5Fg}r< &׋v!ڧ+ZH%DÁۦMz_Yo)N#Rԅ!C*g"s|Ns'9ғԂF=%@ N{N Fy`^C5M6JG*Q*68y2593@SWVx+ r-i 6*6SY?k$)c298b:zǬdz]~7 qd#g7v>y1ɫ̼4r=?d~:5ou'OL\.Ӣw^Z9kΖ?ihigcwnԾ=Nsk}#k7|/I%YJ\׀ i[h*I F8 _NOUbf>lgx|{|bU^jRr`jc"*$u%,wڇx]S:1g&,՜&Tȏlz>-ߖϪc4+Ksݿ|ȉT b9Cim1RK$U-5UIhe+b !rƭbhJZ2MS4@h0'E; G1oKefÚ{J"U!Bh`$'2CM}k 'gRXrrJ𜥌infQ5.fMF:"*%Vc+Dሢ::Z3drH""sĵ3MnBzjUOy55zm|֣Ɖ"iQU\ĤVE9Hr m5O#'u{%nU#qydqRϤ<$&(‚gވB7ĠE0= sBE4ݰ4" p\>,K? :7KQgz|?`)Ɨ8"Tp F_r8 [9  e.ɵbNp:~?wd}':9f4Cb`e2d-FnOƻrKw~ֽO ~f17Yӭws;̙sbnl?N2/;-6ǟlЎ.Fܢ-b72hb<ȩ!6Ǵ*݃#7cXLUY )ݕg庉؂Z7l z޺w9$jM/b%^piGq0ܞS:f(FÅO3HB,PM5OӬTXƌ2ʍ[6OpȁW:c32ATe/j6h| ᘰyL>I%ԞaNly#%9)A{ )Hs3Sח &egWQiFe'(,.RֱF:acZ*()|12h qD H\)Ʉ?>ȇDfWDr&7$#xGZE&D/6g NKZ*ɕ4GL;`$yJVr,אrvy!W)%)GnpohMMZ zAjI\1Τ!9R %",3LG MtЬEN Ycl(g\}l.Qs3&"ȵvVrNh)FbIh`}Lފ@"[JQoy ,A$g8YLuT&7^9Kqv8kP1'SbIoMR/%S)QkĂDr;ĖxU(o'v Gs?~ob"X*t hA@o I*;" d{e5(NDQ,r >ɤ4" kXfq%T׎k@ k$`z3s!\MJq)ڵ2V<97= tCq6␢2/q^\HȹY@BwظҦ6l2{(NpLDžQi %(aY0\HY 𧓹i3VҘqv\ʚOW" f&Īܸ\PcAӘtڀADp;+tBznl $ TR$2A3) O94TEx1rs)*5]څWKNT[y#5$jCXF>ǁ>@4C_$;n*:`*zgN-i}69❢f}cwy(;p\;QD>ϼW2P48 h4D#ТFP j➆>%Hv?˨lMqS[o]]j6pVZȉl]ࣅG2Q)l9L.y~ꙧJy*`E٨L.碮2VLR:^ 3RW`s> ɥlzdj:uuڪ .tc}ytNCCĕWw_y3LC&z((-PqG=g,KVsr3䚳HmĕtD_֚QrN*,SfrDt/uԲC#UV]@ue4()H]p>*+鹨L>ҖH!-v3t+55j+GVW[ّGrr<$B]V]) \ʨ*E3qfKX|{o,r懃0qZ6$VXNY8dv{,VpXb9PW͟tUg!2\lRX $L¢1ZzYb2XCtLMB9j*i g yz| b{: 0ռ[r[<^/S9{;;>K2y(U4T?`IJI9!(¼ r?&I Z'`{*Az焊bGwzޜwb!ypf$]8F* Ǽ?vCQo#G+b@}|.rucU;ܳ?(Dϱz0VE8'AQ8{d/4븱{|A!\$*P0A'MdKKX <7o\7 4q]T7AtEpHmdipIcV ikKy;Nb>h KUUa86l Ҷc!? 
I:\TW^9ժtO󗛸t4&Jb'Lunb3q<ޡMӹmRw}=?˴>pQ*(=TXxa6r͂1 wfٻsZ՝>'Ԓ֚R R\6g<͙\Cm35+o;*k6ķ n|ޡ& .%E }?OMḕEN&xwXrra umwؚtfSz$qI  &A:|5V7zI2%(r^cpeJRQk"e R#+?2Rٽf7݇_cd߯zfHQDc=ǯb`پYN8>B8Nu"Ei .ȸ6V|R$Pj&R\&#e4wzq˨Bdj , m]V H')#)مV@x@ *y2Hi1DV`oơxX&ܻZrDH@j<5 <::r IxÌNɲ *@ErU2FKW.cPbR)7ϒ(1PE=$L[b;f\4DNƎHƦяX󖵯:?趚;Mh+clX v/fQ4|0O?T=s{Qo@lB- o"P |t9G> eC"ZD$*lL -b ݘa@Fq4R]*QǸ `] dp1PV2sĖFZ#gJ/?2dyn|^kތؿv!C“x1.# `3/ALW2PIڬ>uzx9߯Z01)SyaޗدJأ<"4h=)J `EL`(˒sg}[唺H8#X&A)L DDG8Dщm`U `@H[*ԁ/&ϤZtpUchryX͇OmΘj]ĥZBkgB9혇 *Ũ|2AIƣO&Q!bL8tzTY "`&w3"P^qCڎgi|'zr<ibWB-;! >EKn+:~9rB4",D}&%&CWP#lOq0Z񖭛uyGOc룈W \.׫ELt7Xsˮ=2uvۿWڡt<7"\&l.ɄԿROsZ}~yao T*.^jRrAHJSK5M缎 }t"u{ڇ.1-Qyg),5&fȏf>y;(w˗Vf_;>N؇ftq~x{5R=u{.4ǶXdz E5c(CuqRjX2/l)N1,AkrYΔrvJ𜥌iQ2EM0R LRq"FR8#S<(t~:Өz:4O` xc}4QDO,$h.c@G9&'y$긳gZZSWMy6G?p񊖛e*s:g}šwVxdx+_4"N4^4Pri?O?E>h= V77iwʊ┾4zqEO~D2k/MǗ3!s|F: O)5B;k~15P e"f3-W\( @'DHJeC'"uמ)6 n9-^]n$N7zb6ԛC݃tr=1ozUQ4 g7Tk\<25Цn7Npø{RSgo(vqNP=^=5}d>ƺMuY 1b8zd^]2 w0z`9HoP2{?': C8/BF]&. {LYgk?OOF4 wR~:mPS/rtĽ \'P^Sg3d͉=a ̠l/ /xz:?rъ87{;T"ݓx>aOqu'xyC +j0A(%'q"H_M~3쌯]Q\1gNm#k,?]i.B6CA|Ȁ'G *Dp>"9m!)ˆ$|0]LKNƗ9涡,54˟Y_?ik芯kMsvi2[ Tx\WƹGmX֘чxo՟]}{n1_G)eB۔7kc_lE9^U!'\N*ξQ~We\|{Pa^Ǫ7qRBo96GY;h}z#[ ѷ%~n{WOܯ>) ZR?8%*,-P2>ð46|6]߼,)#Ǵӡ=wf:ߛ=b6;0Mܕ 3C$~ޖ\7{|pOd2AuT\l`pw5 uլbj^7‡y.7ﴻhcٺ[wYofóK.6tKtBDfP%Y( -†ڣ+ ]vvIgG*_8hK1-u}ճSjHT)yr<`QrXk6y!d) Вih/v(́>8_LZ[m7nӒio/}v]\x N|qQ 0>OХx4[eVi0xѯ2?Γ x!"jAԄv]oh[OljWrVH oaھz~)ī/IhyZ&e3~gf z^_.D/a5/{{Ƌt|zyګW(&CAUJ ዩL>\lD}k"PMkc]&Cda}"9͈c$Ξ=JO-UHm,AI\2J+-hs :hJ(δ]]c̪ iRyM [G ;@SεNHPU qMQ)',ZZ ֎bU T[+lUt|+ϲf[j{|l}WD !ÅPT+-Ir(@X$T'+Hb*2*?ܫ[ս (^4/ڰZ)w1IPT 2AJɬ%wNftT%yd[=K oW䔉GAǖX4R|R8Js*o``klǟYQ 縶eLqحk_ũ/!P!7mzW˻Fq5.!JȀ\Gc,r+#k{ODL))$B*tۚaҖ$SϓONeNBSrvpК( z´[HgF&, fZ[$@nbF)9[QJ2\r4& K8eN /V .h)FfC照|L[މ@"p\V;'j: iBiO^;*-D 9-YeazuZ)c=hL^3vݽhi*AN:nQe4>XϼR]{oG*8GwW 0^'wrq0Q-qM IVݷzfLJ5(yH"տG)xEV' 11؋H]_A}f"ݐuvZ _Z:{AFغ2DL4E+{ BhD4X4&»"90[o<(32'k2s 8Cf6Bx-,S3eYׄ(5m-\xei "5  lh ,mi!Ysr,|"g! 4'F͆O^l+o I D$Q)1 LG (w ic =q!rVL:}[qm۞[ ,4)zZB,")EWiD,9zRhXendžqOpm,u'`6ɒ[[}k%$#as(K.OG#8?,5em qN#+Ts"ŘjXFXg-2J`a!v"P;br1JMg04T}L { SqX?7P;Wb0L]5S+ޭT;Yԑ,ńqR9w·rh^ߎa(aQد\2t<:$S*ҢG|< Шɕ:@@108:gG'FxS< ˣ@Z$ &5*2<2G'W KA|?G'MD5@97SJ~y?6: tD#r{_D/fg/V%Zkew]״\.gb_^qJ[Nʾjo/5ؘ~ii.v_^>'FjS5ĜOQNN۱n9+bQ[stԏ;K؆4z}KgQjc3KѸx4tl0snlU[wrSj8Vň5R>9R_rb ~x5nsqK orĎ߾wo/?߾z~}os_o^ѮYSh $r/5_}Ӛ47oZڦi݀oѮ}vyC/([$e)R(}7ƒr\ߜz4%&iŃ"t***vVuwNK9"Bd^1nԑJwғI#oUOJ|95>mEQ؈"F V F,Si69zҗ68!o,;<GȼT(x>D~j &.I \s ,']Nw|ܵNߝ5buq\w4FMܠ ={}460mh 7Qr)~&Keɏ~0 mk0s'ɫ_Op8鹚B|[׊-x+mq=HZ$izW&o?&pΤn 6 K lGr_(Ao1ݳЂEŒp:rsnY2<'+K l<kDs&v&NbD5ũ,r/6_!wsR/񥑘{hA3VoO޾;/4G}r ~`p{v+Rf|{oIqFehyde0߷eQrPqeɴ'NO&餷OK|&S) :9:TgɳT\EBEJzg73H[WB/"Jl+K09[Ԥ:4yI,I '{U7>1E۲ke)'MIkws-YcX+.΋DlspRWT"CZJqdRb0Kt(ApK'(.[f{EF6C`m0AЃ"h "yLDR_9/u FȹU¾8cW,J,|P,\PT6(bc=z9t7~8)i)jRSQsɼb&qiQF*g(*#IE,Q/bKY0e'D46JVd3vg*gr̢+1Eƣb jM/LΘRQ%Qbxol*3#,W9 C 2*Yci$rEZth,!K9Z)")h}{#vXjL:+WDt3VnZINrLH!BŭD4i_[ψ9[Y"Mu \'[go\+.qQp7"5*U"ʌHɄr"J+ n,?6rO\|\<;vC3 l] Ϛ=r:{ lj-Z7 r '#y?>)Pޏ9dux6:}d!ܴp͌Qljmk #n☩ vyx!եԬz|'O P0<^P"D'!*a LB#p7 gW=H䐘 9Fhr-8H/v5#SQ?NaG*}qrTs#*9aLى t,U"$(X)>^$OVMfl$TGRa#Pj"駓ZkSI-c=Hr%!ȷZ0! kotX>,>, NXaYv,%y E$KZ@xN(5$&I%|I;FR'dZb }:BOŴP]Դ@T<WlZ)1؛_¥eyq0mۃj{s@D :M[lϊJJR`lp~Ja='EHGWKזbK},aַ.hpB㏃ ZÛo,~)b%o])WOqm1]w+gjdkv&7AQuA@恹{0,~2n\swWCuLf/n>Yϻ$ K)4dd' N FgȽTt _ӹĿ^:]#}Wq5j'x2~4tދWMݠO(9=9Mr !Kt+˂ZdACAsϴT\;Oځ6bevjRN$uiR24$ m}I$Eq3Dzaq4\ IԜ+ZU_('}?^K- -H示+#ŨK9kSج̪$F"kHhIF<25I%}-x6Ѵi7u4F(rNI? 
g&kw )qQo$1m2OZ !qK}7:[i=I%$qHLxC4U0V6]kBDhZUZ3]^YC}6:2~saƝuރ[Fhǜ·>.3#QKi[k ^A/7cBrq@֗F?Žv.)szIrg/`vg{jScğ k&6Ī!Rqf3FU Uc'gpWck3&uEox(RWNqu$~B`ˀ!k=1e&&D5 #RY)F瓊0E)u1-=jX'n5/&teOC%Xe0'0sCFkE, 4FYxH!pPjsAvɑWmS,XˈohlrX rLc- iYEZQ\` k ;F]Ƣ]EVlu:U%*\!bb*2%CdP`4_c4 9bvHl"Pa. W8nvL@:9.g !./50g JlV40[A J <<(ZG@⦪jI#sxb"Q' s9_.J)@gZ9 ` lX "3I_~NZY ]MEwF%R92rܠ`5pF^X)xQQ@IBP C`3/=kw(@H*+9ROQgmX,+T! /\9$Vc+:c"& LgwA:o뗽uKk׽,3V̪HN1ƀ/պӉs!u@gsa]:WT.kTP 3jYSϻ uئ:Պrb^x1u@KqY=P۪dpҫhR0 Rn dPgګ찱ڂ|Q cA1J# I2HeV**26VYFmHm( AJt+Sx$[ -h`uvT$ 8U+ɉ᫘w%,ܶMxYI N;>lw[|vz/ߚwy2u1qWCj,זXkp`Hc#. g^s@ MހwK |Y]!jUW@ @0 hwp!YBGm` @;k1-xW3K:vU 9GŌ: hX%Y֨48>p@Zi298kkl#: gh ` eni a"Õ,c<9ye>F/~ )dgZ}A.{m/wY ̂5*)8BY xAmJUΪrݶ2+{уEXY oT[ 3]$ym/VN%Tam \n F`ⰱ04x|ҕuE~v>M:LnɳDP]\tB$0z`2o6:7  ۷{OSv`)jY:ZtkH繨)UFCon0; Э\fܵ5p7LJxKdM:39P.3d5HNJdt A(P`AHP#F' \ c752[46I#*dO "$U.X2Qwksڌ!$bU_.u _v)&d,¬lzH"b á Kq9,Xc.QmwEC@Tv""DoaL`j?.zOls7f[ sa,tͶ,, ױJP9r֗~~JpNT : CXۜ]0ͧ"#hyoQn޺弜g?z4ma]/ ~ zfEzXG-Wq5rOQe4`|񥧚["A\}E+I+\W$" HpE+\W$" HpE+\W$" HpE+\W$" HpE+\W$" HpE+\W$" Hp+_ W$u/_p+'(2KO+\W$" HpE+\W$" HpE+\W$" HpE+\W$" HpE+\W$" HpE+\W$" Hp+@^ [zW0W#^xUK1W_  HpE+\W$" HpE+\W$" HpE+\W$" HpE+\W$" HpE+\W$" HpE+\W$"գ }ZA 50`FpF\@Ip <Nr\W$" HpE+\W$" HpE+\W$" HpE+\W$" HpE+\W$" HpE+\W$" HpEQW UNy?m'zS*aXnNBp=\kYC>H҇\؊aZ_>Y40y˷Yw =\`o[z?s3?>:&Y WJRzЃ;_QW޽pc/=\')\}Jjm6t hbL'p>on/{4Aɷo?c|x|6󭕱iy_C;IW:r.t)d|.Rdw|ıx}+fk}lnϿO廕N훶…$hM׎;U0 DAXUۡkE_oZ͙DLgI>K'4GHf;zPO"Y̼JVcWE,wj%D7mfMxz?rp]P͌h0N(;aeGtv<]u*w!ZziPj?_Li Tl6^RovKȾ- .YƮ{mllm]Ќߪ>i2GyYm\hlۺDϓ^͟N&eX7\W'eztr~%=:ktrۿ]NHTwblO?NgWok۵-t϶pb_ݸx~ϯb,X]4SڎZߧ@xFz|\AmgT|pCT],αlEhQR? 8Г=,mmɚt3 "۱[[%/2*^N)/BkU)&Z%0y͹wPiڂ7E`BVo}xo>t=f<'.a:zF65 :s~M9;WhNNvmB7=JFR?8R.S5.uM ܇*y&p>e 5= FAjp=gze*sA&%C.j[P*v'x/J&f].2ٌ ;G3/. m=ϵ0j3ǖo0|jg^ƽ¼|}Uvf3ܳnll xk'~vPDn«τWW?tT-.7z7>ϡ|M+?>cPE ;^딱s5:a#0V̔V8M,0Cɫ[XZ Qarah c{tCjS~OgE>>;|^q{%qhM܅ˋ5˜w.xMD[D=#K62^DE %jM-ŤJ0V]+&1CuQ ~<9w8O0g:QίB}bBpK,YvHh|fyy Y!>9[FN:!KQU:sb91'bE0l慕ębZ/9,PLdC+'͜wJqh%M@88>Ƃi:d;}?lx玳jvz\اˠu<9?iYgg}.]~w{Mf]wtk?ՆOEW߮Wo9Ltf peÛ׽Z\|wCu~hoշu6|σiצ;^rj#zեw;\:W)?ήiI":T}7&n3FT)b~Hyb_)S/yFW{/_TA 6vZZc!{_*+Xvn{}RzIKaK4`iDB_ç#g)p-%R*{7fw٪rڊ)"//ݘg :#i+)=oIJ_vWKhf};؝ xxql ߿Wmt8vU,#sh/x,8bS&QШD-6jqF!9S27;#jOQ]!wC&Ckl8zov-0CM>o ֏Vff}<x7L=Z# g(AD͇(] x0TR # $y;ODZc=YOy7cL&(R; s&  s 3ǓNhU !2yj"( {4 {ųŴ3OrLц0p*wUmru %e :p ECpsߣR GW,Ż߶]Ok ~C`^AeH2u}xn {˳Hg+pU G4,bUw@"2̳r qKb,W5wrIN>=߱#7 \ E>F L>RxX}SKycsq`^ϻa?\UZ׵7l 6m-^&\^_k\vݡe{͡6E?]lCN"/_qu>Ll^3aZ>۰03T(={Cū[z']"@2@ͅ RL&ٰ@[el  5`VI;2Ch9WNs. D>Χs[NrKɉg":݋DM"9`&:d4IBuDc D:{;.َ/j-|pHT)y<`S$r Ak6!$Ӓii8v^;r:[Í Npbrڕ̥Is{Z<gM_2rURC09ԜYuV,'C7TJ ʩ_.c\kɄyPɻȫf="w9cW}[~1h+Y.r&i%HhdV\[G7(dbL+v6⩸u՟:Pʡ6kgiʹ+ )YALkTM9qR:1م÷[k]uXMhu1@VInc^otٮ(VL%pDp[\\)eJ-=wLKAAf>eqUfw*[ Nܳ>[s3uETQ*#[oMCP ' KzxYn@@4̫`]Tm:m=N"%)C$ 4Y4Ad&/msNⴍ㈛0/aSEj:jm, ML`x4GtVR, F96 IH# |߈K2F"wMpolg<T&Fu#4E{̝i5pt\UA:JJ2E{.IVT 1D@wk196 *89 +LYbD Nj5J5xsVIx ÌN% $#_v!݂tU!F9;iTDc1o(bR)=jo%K lTAL[v8F`fM!dڮj~7 +Թ+w`ϋٯ} '8,LgTE0Wgo 'j^ɣWx䅛^$\z '~5~6l1Qk-U Bݠdg(09E"!˳TIHK*x[W{=nkx~=7u\zљ:_ȫeyLDs+}.'MWaz^>TN5g~.ˍZ-&igf5 i\$d{^{Y9+SgӢiho/2dU`a0Wҡʺq o,NJf /#γ$*W]ݚw2ir+߼睟2Jj>*Eto :a($ETฐ;n, y.pШJRzJ$B60^4X̅orv>Hi#"Y6Zm`e@NZW}h_mH q4jAA{*-X_@t$4xLk~_n:ى%(#Kύ$Z  {G(w4@ !vZ`+Z r%5ӟkm]p / QD5Gb>u=f-E&8UD(D edIr!dTxh&$P͒ɥ|F ynyGF~`eV OEnj|P;j-nJTT0NzXmKTa^ #4Pi J#-sQG:ة-KK6 |$'T. 
a1)mp5`BE!hRASAS(s(ٮ_k(r[c-2o @ÍaDFaV0aK#06%xȭC:#ca<9uPN, Mk%euɪ??R$r2 6y`;.O!:7eϖPI57"+Ub|]]J{uĻs!&M=utf$Ou& >W >A94|WJRZ+nUGhö\-P|D2'GXJH(h`L#*q>IXh"s3}*049A'mp"+dF(.3hnN ྖ[S8 ]yĺUM$K#ϝqז[%}D^UVEŐJ;mʾX; Z9ᕥ2Yb: F,LX !DH|Q&[/fŻߣ璐bWVqXޗ0Fr-G )C%wcoB48afp⇧{92=ڲ9^fT 9 DXdʮ ЯFGE0g)Ue"#Xyi1q#iV WW3^ʼnzdSsD{apMrMNL (B`?S*LE}yvM[}Eub~88]Fa i0Jqܞ9 "3w' 3I*7$N6M6OSgRʮT!Nl0Oa09Ft˛_^<{<~7/OSf8~_pP0m;۝\/ZҶSs#ڴ͂o1m>y#g\q~X\|z:_#l9hҕ(qy d~R̊ QA<#IXAvט[ۿ¼FM|DNZvRtNeTb+HJ`!8w:8*EBg'}kÒ`*gC_#Z``L D\"Y8ED 8?{Ʈfj1 .]ṋ"HγjXe2w۝*ۊ[Ide4Liy鲯3n:BB_yi4:{6Bw't+^i9wYZ\᭬WB|^1*1& V]8TnPdQ\W?<g}9x7Xbv;?w#^O>3k:,`VT P; #!f&_}FIforLB=vdc(bQW?Gtiˬ;+/"& of(x8hwW;zxKpve jo|΃Wߴ|7v_7f|cݤ۹NLyH+im ɝAӕ5mAI@ FD@%W<-*_ty}U>ۤT1A)'3rT\ŠbbDS{M9SJYfi|vh6P%,THswac|X!O,>ک%Cj<kfq ΛLw?l+E{A)XS}ueds(xJ)<͇M4y;MKE:lzoG<8*eb̛;d<}$X˕e ܼ%6UPX/N{<`ZlP(xQ1d(mCOKYO%:DM-3sVDd/Z=9SKKq/|E jq 6bQbbPחvOT﹨7ʌ|h*7qoe |vsiȕYJP0UxR-|dGQ0ܰk%0?wDRBJE"JPT=Rցgb8 n@X *1 %fGK1H!*IF"3䪑\ߴGP ڪ[3RA[X]eUv 4ۈA ny^RIV zR}95Z)!XuYdU $C5E+4_]'j3x yTdUU.A' Nk@a-yvg͌K˲e- oʝMOnn0uApS3k[퐎x56,DB4¸l4$$":,cm92c=u6>WJm{*XƐo=wjc;ҨAG %lRKLUi:.LQ%&Iո]A?qg5yKw+E?C_nPS0(WK(RCu6*uj,˄-b$Pw#yt`-gK>B;0cXQ7m{?,Sn,MC?oڤDΛQ\LFWM\UIycŚЎX`z4pu4bi Lp RfL Z ʏ8)oMpz5k -v6[( jӁ:>ѲgqT'"IY뤎gm25EypHOU:keM2H`fn+~Nbf4N&Ͽl54łS*!(#[UЎ|f]Y5F/+rQnsLj xYR!bd<&Ξl=iV/W _`s+eٻ`@UUKE9])1AR9Rf]痢@N?+aϴf[wU9h՘cuqŠvF|Ho&;y۰+ }Uz3]HL4c, LOe&r`*҉3ޛՎz:~L' BDZaXI+:j,0 %xJf[?iYg=T׹*O* G`^+/e)`:dg$B39d1D-d0J+f{9(b/zRʓ[vX݋Vk%sTej<(ٸ\3*ijGE%̶&T}K[`lHƇe6꼯BELYct6C'Y.IK(4 =4٤{.V/Nwǵ+bntB{<$8?/ks"8Qg>卓 O^N 9w߱ՁB+ I'5JXRH^t)r1E5;RAg[mX-O:%r V5AM)vab DnV J8o ɍր}jrZojy&|& @DGp.>K?J3/Jki_=z|3x>*>Ԝ4l@Ǫ:|p-E~ꐂv7{iC!ݢbZK_ Xh~Ŧ() UmT."G ݵͨk<+G}c1ٗwe1ڲW\{of(x8رd{ޭvj^~|KkA6BU.b;1gbZ8;;V $#AmSȖE;ݎ?e}YWȬMJr2*L5*)U *F &m5L9+Ufa9k4ڠCLnPr2P!eua_c|X.k?c]Xwsڸd]:X3kpޤoW{Vwۋ<^Z| "UQR IXICCMPUpf&%{{ s"\f@[W(l-~;7ecJ$ndei|-ܔ5]Ḍ֮} ji~2 ( j IU[ M~ v[FէIc YO%֮u$1 *& 𜮕Mj2 񞗭>[_5z]Yhz8m_w}~h1uu}(mvc,LTç9_>!oE 気*ҟ;P;PRQx+mH _v/#Q}HvDS$"+AW=KSCrƀm쮞uUw+Ѫv{?lBg: zu)8j 1m5ޠb#xl᭣DsF" x8R*Fi&rpߏ DQѤ6!dNGFclШNB!ګ;߂r|emJFPϐHD &M&.2 !D^% E4e9N1.@9ѾaAm-ۜ`XTK('JoF4ymn)G7Awdh|''9ʄ3pif}Og:)(*o0$4cR^3& Dnd.%U{/l 1$ET(J!$H>Z0ZcƠS$I*XNΆ 426FfdlNWi ]T_XW,S6ʚ au 9Yf'_7(m7uǓ/aZ ͹@38^PN,(W;&&AME@bQ]j 5dcg6T^3A%.N L#xn͈db j76oyڭL)tN@$ |TNA") )0"PtIHeӂZȨR@+󨐚,2%*Qu!@<Q٦6Ff&2 n$RB :RkJ%F,.v^Y]O=ަhoEr/ x/#Ā91q]6m80(3Y`jDzm.C1B&cȡd-sKk}3F];ף_TYC >F;AZKF &%0DŽunK/Ljա >~I^8;8w%3d) L& MgAemѩ Jp Ѯ6TQ@V0jմ}šD5Hj[5۱ v CRQЀrO!Rc~ht ."h,E "y.4JQ#e _pAՎseL[>Ԡ)>o-m9PP՟o{r;cn:.-%<#iHt{&%k GdIXBí_uq m/3%#S hF{`!ik܍#'%Y͹ϕNJqc?A -a*0&N:AU2B2H# ! |L^EnM0H޲[niA' ֩גfHvOBj,ÔQiaUt[Sfi5W0?(LÜ0O#RD#sRZkXqFa&뒐b1ps#z]u{>b`W|O\WcHR5M~p:s, o|0.~|ua:sI("{"8-b*q>dpD5P0J(ѐ+|#;K׻9O/bϔgQQCJޠ *w6HpL` 7B:쟟e.I&Ry:; lE?^W'TM1LWߦUHO.gqUxZᐳJpȎ񣳅*Uq֜y>^_c'T^_M.,+g Ή_Ż]: AV5NO.lr{ Y1bH r0l0+Yfw(!Fo1~fNj1ɺ)rTF֏:QWUR"zHX c01W*I1=Š>yc)lMyCY_?.|wx߽;{{曷w~7g߾;L7q I9*4 ?<nAyC 8lhau-ƕ)0XʠZ W^Ûdu/[&Zݬte"̾d_O=X2TQMX6B}mct[,;l /N |͉1 }p" A*)g:qk@[ܻe RNz䕕v+rPPl? 
&uӀ` %e%Qc PG`df3z-"3Юg/8|"XK"J-|d h˽>oWA!\$*P0-h"\Hȹ%o u'VzH_HVV ,_ch{pQls[^9Z3iRb^PL/Vd&WR+RIE׊T2֊֊T%t{E|Kq9UcvE`Џ+Mе}p[wUB <ѰzԼ'131 I)?c2oN8cLrBqV=`=36bpMOf1Ubzs•{W_IL,8 QJ5"VݳoܝY l=p)~^rR~Q/*˭͟6h.ݟ7e~/I1ZI\F H <ӶԨϗUZXI` ptkI-uzp:yeL~ r^`9?OL"pY<< D"\yH$yvh BvVզtHG;&3K+N ȏɬz}z/X{?˧IZHx!J KE,T1FT%ښLҒ`PnҏBw+ݴOZ2MS4@h0'E; [1oK'呒H$.}PA 4dXPr!Wə9Eu%xRiFU'J;:Dሢ::܅f:8R4#)HO33M.B =+E/ yXzD()1)XELp@GB[iyn}/3#Njz&!1Au.e_at p^%SW!&{՛tө zj~; GRU|Am}Uiq2 jDu~3umyY ϜluV0\OSt |oSh0:?λbAU`tJoyITzfݜm'9ft/c*SV\hퟍyκ #Q M#p n0:=8$?_T8REW"`2^]t~!~E+~G>|Sk蜯J̝I|O_Ģ[ϮgU!:2{W(Z[5d E>ɝJSJr:%:]SuRY 6Sxqs`o|:K!6m2آB'~[U5C#:.3gF |{GWi9|&w̾`@Tc{dr|R?V&-N-KOU"Mo}.,})ty};^uyKrY/.k{a*8X*ws~{D骋FnR/SO#@)__ImOS?;D?mBA2>aχ̄5Q134OY)]O2G[ӏ /|dIu9$nVwX :_ xd扜3yBS;`B$bGYܹݳW?xnEkFKQ੐pbF7M6Y!Dct0,0_BÄgK4 FW-7[ֲemkp5IZ%=sVA(cӶ[-lxX򘆁}W%Unc;=p7.LUQwIe P+!=EXkZuՏZ3j'ZW{;j~1hFShAm]kLFnFZ<G(#m@U0ǩ`*ZYQ;5ǧp] oV5p&Z!-Y%x< B]K[]ja=mǦ~-~4OJ!Sb}ۀ[FLq^;g>U }n'IB(or`ue}e`ŌY1c7ce@q"41=,xN` o@:2)XY'׉nΒJkDVi^j  }wv&TdQklMܙ0ڧ[$J`mkpD*څVm-k 1uA%k]>wu[}ʣ<5^/\=V^bhNJηgv3Ӌv;wίR:YƳf- ͋[+-U&ټ[v>b8g^A㪟\~jAW}TZ!-\Wv=cpE l(\\swq:H0W#ĕp҈p%bZe+ WV:Pbj?W8x6v"rtx \W hc͌pEW(W* H5 U* F+ ژAlW$Wf+Rk]J[103H!7(lBV$ J F+K8kbz6Nng?.reA;|g[x:[䑉J Z39wM/'ӋvwtEلVw>U6wncxI &6BP%g`zF0H &Uj:H嚫^p;̖]o>MAW ul~r߈Sk0RĜ`Wj׮{ Vf+L.BqE*)!Og8\ n=|@\'ؓw \ZRtj'+l4W$ײ\pEjVJ ƈ+%L =`HW$+RiEqe6"F+R *u\i+1"ؕ|pErWXBۂ p"7$87$׺\pEj;?P%QJYV#l~^1GBsu$ :{6v__u epy/~_ej| Oo詆Ǽޟ^^xH[+3ӈ~oвJJYWʃ J=zwҳ۳dV󦮫VDQ)B(QU:P^7}NާypYku-]ŌQrӶڸK] {uXۛtsv誗uӞWR 5֢nmtN < vBgZ 2cRyRi{1<)sB6PTy"d=\-1i:qWF,U?@K\ˌJWlծ]%h#2 6g+X."&y\J F+*#\`0 :fs.u\J W#ĕTZɜpEW$zՌ+REq"'g|+WVqE*-/!ijd+$W$We ZuE*]qLj+*ԎEFv+e."Z+RiJ} hQcpH 6/nmaJi?,<`cTN19 [O̎WֱJR)JqF`e6""ԪqE*u=\-1m v.O.8fKc'Oej\\]DŽW$lpr9 :HW#ĕ̂W() HQ::H%@q%4<'gg+M."&ygTڂ1JYY Hj >*-%v5F\itW$K{s}/Tb]W eF"3rWXvR)u5F\YeapJ]'s7ap,17Gkau{ۀZb!^ r6wkael$:/\ӫayV5_dIOl OZzuڹҝhlqʜP[0`ٙKhr|$T ch,H( "ShW3erW&#\`P -AƳI:&+RY3?#\]G 4_#K=^r>uG-CK.-\A\Aծ]7d+,4WlpEj"-*z\I9 + &sRt F+ŀ+,W$W\pEjj\0 AkW։q)J\Wm+SpW(W; Pɇ wl.+W(W0 HHWR\WHx* 3kaаz Neill>N0Ʌl^1Z+D&b#Ax- 8$IpEj;p\J $ۮ%;b<`ɘ:uS뺋j=\O:JWmWt=BsHj HT1JHd+y>,\SW#ĕ˜pEW$t.BqE*Eqg/Ob&\\-qQ9Z~*jԢQ*$]Ѧ> ?6:K9?͙}t~b׽Z9^Eo.B7~ȝͯr^ Űƒq u~Z|o|3#gg/_vWW(2w~t72ޠzE >^Ap6tzu2>Ǔ?9^MkTH? %"w软`/gjο;Y\kB7 3?h.U8ixͱ@OtY[[]o%O*aZ.DNF3 s޷ Vļa%gu$+cs֟_;=zl>I}Ŗ_<,q,`SW~_,2C3]v >o$C1V-@(Vc:? N?ƿM?~8eO_Qj\fnG~},oS4=F C26ȄB#ZMY}mcJ6&x$SoVpGܸ#.K͏E~gq/z|{{ǘśi Z<^6w0& އi Ǔ3>=^/v<+Ћ@/JR,Oq_3Mb}UGfxﰽ|2wbX7Q[]4@ )J鬒09Y,=P49&0 L`Otd-|={Zto>l]3Nm[5fg,N&=lg$C4aS "Q9RDD=¦W9R58~p' Y$e5d g瞜қނ֠Nuy5Ÿg_z=CG{ӆXadȄh-tbun2}M*qzh+JM>oy-W<\^rE9"d]f=DAI0LI̪3EL㩘_uɣCkx'M|2]|(bVΒr=AFgW&׮<7@Vji!CU(V4EPQϧѨ7 c$tETB&t&!Zm V5Ve*ّeyԿ.d$)Axf[TZRœ c`QL &:J1Gsr7n 4=JwcڑP|}LdZ,JPPJJ^aL:dÅ(ʘ2g'w #L%#Tݷ;9H.B ʩwq mH&_%m*4%$,zR5Y_ja6jXic.m@+_o5ԯ,ig!;n8@ %qt&>8~R8i]OhOEFxY%QY0$Oz. vqZK?a~U_6@wOoˤC?,dmyzucU5|,`c|QW墖u NjÃ<<;auo{7~yϛoyͳΟ )pMEDzjho5p}w|׺.oy˸7Ga[[,O7ӳ-W=bլɬxXAWuc+,6~D+_nHȧغb? 
-M[>cɑ6ăx|RI{-BP0i+кL*P$Rtcy=OkZ`՚u""jp \b!MP(?Q%ZYDNyN':3=R}f<4ЙNlĽ# k7{49wugagwΓw.&\s Sw%uNjvxu3[f L(Ttj\݊Zbbe$Z>e$(8$d!xB!H$cd#IYN[vh,K)3IO.usMÉ7 eM.JXQP$rńryhXCN""+ tj &y/sp1z&!텰6"e& 1gx۱t&C O+8Lc r@~($'g:b{R!'m +@XXg`4`FEx Pt$BGfI8 3X7Mm+Al(3ʷS:z -H5$A22H@r yL6xӃ+l@jw ^Z}<9a F: s>N_xF@j`m~ȭ^ŻxeqT}G9ae_!ݖErm ̺eݨ_ps3!-l<x)Ƿ~|rlT hۍ7@Q7IX)UՃǀlQˠ] `NIbF_»bL5QҒ͞/NGA6#"Pb%&U-heMT7I Z2fq3qvG'N'1h=NWky"K缨Mzo|?L\ش%c K%+0yc9I4F><Nۑ.`|v}v_?eK :@;ݠ7۞<1}2~O;GwUX/>-Xr$ OYYE10LmDE 5:t>M,}a$|]7y؎y?*t"f=l?c{={.v/]gxݡzRuNR@u8:/K@||9`$!DLT3' # SXA@@*!-DJP Dl-Lx7n2$6ރ`rNܱv&nCF5Sf4mp-bg3=sx&Ȩmm-;ص^{}ۃYY'-9Hka\q8&joYNyx+c }VК^)@:XQ mtTP)8:ֻXRuzla]eqXBR })YZX:Ji vR]{*®4c__:O /%6?xKB`j{Kb8h8=b6^klE$rJ#EY%N[.RR.[HЩ0c^\`K"{dkPvgڱ{m{ޥʖB(ZOX!u%EX$T="'LAc]̩1Cfd(YtX$"eH!ZI5;~"[xc_{D=bx.kfbʥDz(J\k!tbr>3a5*NN=bR뀹1ВIT~&aʾݣ%&33i/:-暏۱GL{8!~ՉY!:;Ӓ}//~ͣD&CuCeV!ڔEA(dt\+Kx(3ڎ> ZٜN9^ͮ~K%im=dgJxdVږJ!7|.h4!z$IjFrZ> /eKR:J*yjg_#v.X"Št1ӀKʽ{6d,f :1z˪d"9ȎWc;&©u[ޗYUwp3x:p6^&f!t |\l]lkdYKQ\WOUoOvdsܵ'ޞ8z{`R:HZG8T2b|8 : [£Em^(Q>I Ɂ[dR` !G6%e)]oJ{$-!AjEu~67'!$EMٶ}$9e)w02s 0eug«Z' OU^W&}N']V H+)YjVtkmHeOdo8&1~Ji?~g ERCiF%N*-y.ĀS:{46R\aG B7@X 2jvy)6 JP40bQGVXt^jG1h#Ҙt46q%Xd}:?VBv1ػj, 6xyʂ4BkM̀ǝQFG wR%P9jn g=;FZk];S'gH%pC4|%u%F%8@6s52&Sk=<#-8<7sčPF qr:xSO*M;䴑IHGԠ)l1>vO[I>A;V/{֥E7OEQ>6m:^MO@ 着9.ƟZetwtlٓ,:9\dܾM{ww^Mݢ畖!9[\ݼs9mt~ W̳+rմWF!do/nx{-mwkiҽ}y LjQI$)iպ-%AZeK'))JaIs-) 8? P,)_ՅTƢ;Lt9%7:iь#XMI9~.64~H32t #km6!KMyx]!BpEI;{H\Ǡol~V ٢M~ܳȶ>kZ֦m).92-r21Nx^Hd2rE o8DI+5G Ǡc\ pqe>B+ `Oqz {Z",RgǞ;h}[8);@W$&NHP8[:vp;wN׿yHg}?}/f\|INF^^}һTj8)Qer,tΪ h\~ YČ ӟó)(YB{³8<M`)fdqI/=f#7;f37`).taP^q}쏯 0c}ã 0,~e>3 ?ʨҤITc{ʿUAR|ً7bh0)r0˭VȌ +R|yK뭔"3g??:~G8/ }Xv }9PVF75*AA__?f",z:p t%^ۼu7?p5n_$,otM6Y#v}yƽG1fMN/bBё7c Kmƿ~ϣ 16hdJ[|Ny|Wn:92-4)P9wZںCzB6.JU ,RupqmmupëΠbLK5h&{`lyqκS@=nOg=j+'gmr~u/ ?p80hQ`SqHǝT'  F^ L^aDB#N˽ӝjXB`I(Q2j$1ƃɉGbT% t<||O[ ^n\G[n_=v҇qQ~ {੗6Zᚣ=OR> ZM.>~f]]+>A̱}W` >]X B*-!`5Cl(?wLXᙰ(mZ&(#^>Yon^N$'jC 4hb 'ZቒX"BܹZ'V:|1PҠmP"M$i 3ˌ3]m6q C֘s\˄A5q`I>#]W S3Oܺ؋ņ]/iҢhQ!Cf4h+AJ{Fj&iLR-^nlZYb*ƀj4#)+>ő1BA.gt%5"M 3UmI^4Y0o zቂHPE@Bl).VUCYg)bkKl05x sZƑ3ln%9ӊTAwG#n ٧n^3-w[ǽGV?8 8WVh, ֆ)2T06 σKS(7j<[֑aYqG*R.kexB"`5Ĺ]G&5_;R4>Q5 `Avn_;HԇqLeI&)_E T9i1c:'!Nq5vV yE;Su]⽢fc%K,P0 Re aɃr^;t0l5f{k,X샥 X`-,:Mhm6q橴C󿄒A*_K9A y (#QLAaKY=KyVCgJ€g;ħ'P$Ϫde}³gųOf)giyΠNˤ'hce4\f>.O\=)Z* ,ek*e[*E7t*bxWGWDhEE* Ek ee6UndJRʎ]#\QI4-$Э$n \QIѡpd#+&m+JUV6c+mjOlUURg=5[IZxv\#\I$++wՐJ"thx^nA*02qC4/eox<éss,3D{L0?Dđ܊)ZI tW&#I\y4h'״Ep ޞ7I\W DpK]EǸ{q~[чh:rϫJ<b9&Aa\?&U8)0|/C`<Ϧ1U%]_P' 5cͫ^<10 f1e׹Vpb#Vu/ŊX{?"-CM@soI s?G{/=%֣RMY zHwuqmdUC@ < ;/.~+[$ώaKdu[R,:E\KAxWctYzrY ?g2 o3wsrq]vA|: Wꗖ\}NlZsw߄>09avDTvߥΙh1i6nec:ZD~yd] }K GoX6txΡcϧD7CWW&!ޖGeqߗV-B<G)]=uEb!` ]MD[{y$2K+붤q]n <]y$ ]gC/DK?t.tLKW>ҵwIgj|~v>==r{ۈ6~'4<..^d`W7%损zra>9Kcz/p`O?xOҍɵ߻pܿ_{qCGW9?#?1o~0U.PSٗoP_ #PQüܢz|- 0Օ/wfg0?} ~!Nޠw9r۽s }ϯ{~_/~NqI=;#L$n}6W/=m!ocLūK/_?o.!9 !ד $W误u@N޶GVc%qLIYȵsEl6Bl%9öIݝzVBr%i. yIȥ4WAxaק~1ƥζh[~Zh|mC! L˜dS %Ѣk#D@go޼ IKcV0Xk77*PRTt1#%IJ+έ? 
м`2cs"1T7T(:\Ө%gCnU~{""9{|_mɶ4⌱ـ{І@am4ĎM!:Ζ20T`&!Kh*ƈa4Bf/`}G3 4"$~"iNs6s z@֩ X2,>©N;D> rp\`笪6^{r} 3f0fPfxɖ쌭13 lkaMVz7'ߓls'α'9fqu$~aSƀjڄR\H&XK!Q;ӈ%#BRI;I:?Oђive-+JM 9ڊ>dg݄EoGZ`L`J"s[k ]5v8h[#Ҋ :eOhBJ6 `5r(-ψ`栢R|7Z88NªaD9sMm)6A2h y-#1kYT\Zld؍,w2XUA iNYk$ )ٰQv̡*"%roPPCohe:gAuPe0%AБmJHKٷ3jUe 3i(yw䄰 V/^,J肸QV24ITiy],C0L03ӥe0F\(^b m&PTg#VH܁m CǷU]XUBNu~d}g:]OU;1] l;V+.$`]0Ov?NOs ~5"d&K8r7Xk3tdDtHc.m%r,ȋU LɣB- ā pАa6( Xsv,!ⶦ3*Z.T1S;6@DB bA *5hIVՌ#VF/ֆQXPa8#JI"sQ:7s2XdY$ fZfk ה!P?Amj DQ؋kzs+`Qy aAX[Ya3 R}9QS#hU"Mst՞Ewi 52y o; [aen5{%4BZ&F[퐄PNGçQ$;73scHZo2S@Kel8^yͯ^/`tߦ89{o8$7AK@H7({ 6#hl6Ҧ `_[}$w6,Fnͽuf"e-6ӌ\ IoH#lk80)QC^"$I˩4dsC臷WYϡAi|z:OtvtդW3؝ë17=={廋ׯ zcg/ٚNw~~2>}R/bonˆ_Bۮ_OO}[^0~'P]KVb#!THhUb6>(O|~:hSlWmS;)_'=x]. ă OǣA]{ 3跟ei¼S[ٴ(NA93')ji42.؂g%,Cz˳#'Ł,)q"g  `heӻVަ7es/D,Lbca2)ϡ۪9YeEל[t vٲ \یĜ$Ϊ:Ÿe>eBZVHٵ:Ild뵤i N| 2Sĩ1`ׇ CmΌ-iei>~eő[Z8K mZ E=+רn>8t%7lzyL@%U-t{$wzhۧs]CR֠+CRԶ.m+D+m QV6utu8t("B{wג֨+DKOW8i]\FH[ 6eGWIWq+LJ0Ai ]!\BWVlZǎ ]Ia-+,ڳԎpek&V4+,em+,3D5K4^]!ʦ#Е洺t#׃i[ tBd0JHռLDTOҞT0 n@ݼ؆ )HXkFFl҈vҖkBA+B5s`DkAPR-8t6lzuL1 ^ e>}䭇Vz(vAW6=*"B.zhOWRLV+kI[ z=]!J*::@BR"[DW-thUzЕ4j|QFC&B:P3JaK3Qa߳;Qn: )&A [?)b?/a_n thՆQ78XE@=< ?Y;n'EyN;9:9ZQ(ƀ\i ? T PƗK}Qa-;2l"M-H(t c)=R 楢x~t쿬>R0܆|I?(vrq6yr3GӷMkڮzZю/:fE}}O}B+nZ$fjwC+UӥD6`?-th%o:]!J!XlW=ʴ꡵+@JKA+NF@yXOJ\U0̊;[ضg p-emaiDiYQՍ ?flҫUFlW*[s`_d`BM9V#kkѼkFX9:mkK!&4LүI 6F] DQeti9tbQ)7,c;K30CPx&Rz ہR(=@?z,B??/yu:Uz=oy>٥_f:$kgYl|ye=ma.842Ax,౳5Dp͓Gwf6'w_Y%og[֬ \ڲhU}%ps;V۴[ҞhKm[ MD_]Le{pVY=atkЕMO+1C2tB\ututŔV2DG]!\iBWV4+ۤ05tph ]!>a{vJhi m]`KMk ֨+D+d Qʎ6̚=kWwQUt(u7ڪ\ʳmOG O~{'YYbs=x ^I )j$&'41QYj : xJb Os_t*`-WH֯?S 9E96낇I6Go Kj/nuUgЕZ!6pV`lϟ@4]q ʦ)8UycU?]eğwyµ"@@||rB,M"RiRM;@դ?}q;W\} D[g`⏓`\(!As0קsuNfb._C,mz] {W/,M~3 d: ݠgy{)٠|χ+ t-hONxV]%;)٪_`VjE,sJ8HR/k<"P x,K+ YZA"%e$>\($ 39wL^sJS"o,PG3-&>OELm`.1M0=˼~{UL6!-/ڹ7S+/Mf cFq,,f[Gg! ʇwMc:G=? 8.jbxCQ|2!aF_w`y \2Ab+_M79)JcV[j&}( .M-vU֍1;l7ZF›a~`ܵ;#%uls=)`[sǫGx+xuD%-z\YW`-t:uQ O{V *MQ2᪹]¬>\=E$NWA(4JXq!V?|eP?s"bb&PcЙ?.27ڵ:z@E&Q;H2kK1z*׊bhYs6,Ʈzѻ-}=>T 3+jl1yU}dxk[㉷C2ŵzgqr]:h px8cǣxOC%<qmyۼ~WjWdbB޲XX- Qx*r U0"em={4aj3T$eW7 447Te!J;I'I%Tڕ :iΝtZ!^$,ב>`Ehe}[\BD\w?Z,vD"S@\N}%)sZ'  BijlKDӥF {7`ZLS2傩 ۊ93zROTX|R+}& M8RKJfh,+/]Y u՝Xg7((̮| $rGzi$Lt7 Oc4קubߥnrH[A/܌5*)z`ڋwʎ09$ϋ? 'aխc%[RDS@b`/&z{ѻ//Gҵ ~ӟ 0ȯzn.|(ݨLI^ϡF4JxĩޘU.a^dsCo:M ؤ8]<O'`>1=C}N )/2 G|:L{.O4p?cF8'}[`@ _>dz,x3wBKz?#`Caz03":>z,xVTI8K4P)grwm$9zI` ~ۻ/;,Ls9h>_V{~9k7=k-L%, gÊ9X3W&!,rQa)8r<>FI`WÜ..i6_&kApc|8ʥcEP:jn|Ð]ݎpʸ7VVyZ 28SJཱུe=xI&(i5X$GBTWur\W-ӇUW=H}c_g,/^VVP\t!rH!M>s ~ADU"veW]ICfU8glI&{Mj0*y 2E%.P"fXo@{W}Wo?ne]wgz#fe6_l>,(s5 LA$?%o\/2m}dnϱl2<3U6Q^1#[Cų$0YA6؄)vEE1VAx`ξ`>O0˗ +QA*kt(p:]df%4/Hť>dVWTw+"mեw芭:eqGB @fękqM-β#oP)ܗM\*HDyjS8 p1ˈ+~F0so]62CISGSP* 9 cN8Ry.EkI"!\S"f"`Sʆ+B]۬$6: B"(ܪD#&%zSbT !A[F^1?2C1X Xy& z+4dX*x( ;ѳ`;V3Ecc3*-vֵ8 xn nB[-2xG0h >Q80]phEk_/f^Ϧa _q?uH;4rA@2V:^+SG6K+a2ŚTɘDѹ4U)oX}}C 1R@+=CՄd]Eqɣӊd,94m8NvT~MYǹnX UMߟ}z^xc̢u3'«h/m*eZ\#Y^=;?~{jmh` w`` c`XSk˸ ?$$&hUe]vd,@A[ @&el-NEW!fh{CxlKC8o+9ًDXB .UX's 81K$26J 97Ȕ ]FE,]oY߀.xK}_~}. >KWLxF)4ʳ\MA[q/٬'0R-]}A國|a|_ߗ\՗ɱ7 (^{1ժ05 9E[GjߴfIp)P%;Pu^! 
E>=2^Jq}s7QMyIY|^5~Q4idr}vzʸR;3ښӋ}7LugUrm9[KZT Urdjqǖ$8c@ \4,6ڑC,B64[$뢸kKq.;Ж Հ\sETb3<6ݣ@C,/I[PoӇ^sq#hcj֡Œ`2 F=[ 1,6 haOgapkSRZC< {]Fҭth([<.٫A ^  =AQ(3O@1LJD8@P۲[yu[Dc"FhC#C2U&Y&6;NӜ)ZnfJWB1+ Ð//ziF $P/{te>]7ZlF4+;0.6u4s65s6YK!fΎX5B׀*q_28ni`$jzKPLHi,q9rѤ0Y8Ydp5#_ɗY|/j Ѵ_QO\҆8P{V;"O !~4'#:2A"sbDP\Y(V m;'`a ?X0*2G2G+I@&vǭt[FUЖ)Tc4D hyq1eT3N`/rF<@gAe4OWFYKrIlۖ4A?-ҢCq4y4w\ÄLj"U: {@{wTVÔ#!,T%`$5DJ8%!=J?Js\]ގ0HK e -__ncuH LD8hJC@ڕB1VQtj6)'D-5!daFk;ڵ\+͎ѵ WM  ֥]ڗ&EZ!t,@徘:{PrK^"b &e[gDM )CshY'WVGs R#zˍI_Ml.54XIЁV3@ C" ma9Bx&f`] ei#Ԅ?d=e2G"H(C'+C<8EQz@lci;՗ o2f@74nWF*JO ?腨bZkik#hpDTD'4nK/;OjM-Hi .i/\xAU%pW#rw>ǥ:\4jKZitXr5&~E5BuHh"]}YZ^h QniZ[eM4 3 2!yAz]\;zkkr#JrL'$u&KK HKHv u#^P˫O;^++z#9`eY9z=C8`7[ }]9'RU[BDtkA!V !~dhF*@PMtMȗ򗗌 B|q  $j6Xq>t=hb wX:C^NhS?LJ}/?y7n`?TNڙ~L-8Zm8h|/;|yf_c2R`fu֥)-J rkz'w4CyQy&+FM'qt;ëI1Tcn5%l',~@Fgf^:+ǫ4B{w<z8_bgJ:s|]L:&S\*qe87I<ވ4rhޒpm%SRtWYCŠ}&L`䩮ݷߵvkhݶ!!/\D/SWZ;1M\\),%7__NekXgCpF@uV`\QgWOf&Q>}R斎E_& PrL F!ь&MB=ϸ c-9Ut مӱk>LoĢڸ[4cKp¯>$n9wFX(P`3kY?ˤ);qM<ǝ-od`xEQO5֊n~ڜ\ḟ^jMenQ< ]Jv2Új)gzj%H\q* sĘ. >h73lcfd-5\ӶhC 1]u) ypPg!Č}cZ9gs+Ms+e{?.'cʣsapwOcn: ѫ*EiʆӔB+ n2"h Qei9+,8R-x46`a*hJj2Ř߆Ґ6 zY,u;Pcn5.+( bNNeM{{+5~#ej_̓owtf?XUO?꜡Ug``'?!ĬЫQ>Y\^GÐPFw *x2Ȼi/7z-nܹ<*tJ('Onپ2DBܮAq-8h\͑q:Pu6Rp&(9|*6s1w1jچg y!ܩ8V&3B?e OnvVlm*.N"H/#dG=qQ b"Egܸ1MϽm_#9#z hP 籂Ț@o$fT(V/}3-EmgΩOLQtP >jPlK,XzO5zGLӜJ/.Jhr$;VEc9 [k+h/Cd-X3~%jbt]қcJuf9镚3-H轁^ Bxuu-^c:4Q36޿cRUHV3UjTHK1à%a3?hՀ%z q^u4 P|.mR$ {nʉ!4m$#16n zxvSc۸yPdF#ͣ]pKk|xԺSLKvzY;q d UJQI Q%9Fj+ǥ_|Q2{6M<0BWO;sa>) ϟ~ҷ .[Way%Vo ,C|(F #yf{"ǯJI pA%#rTr꾊TBn /JF{k%O"N+',Ɲ=C~O*~4.E8w"W $a!aΔ ׍?^uFMg;cQ\O߼KQ4eVJ= #F9_ZБո oT,qwT݀Mxۈׇv&a=7-,FI[$8D=;`1CzF<1fuU塳($РZ6fϚC^zi^صjLGo0VU_H}nó35`llC\$kAk{Q-uP , BK5_ 5Az>t#Ƹo=FQgA =6欜#*@F_J+;O;458W/~pQэDUTCG1,SnÃБuVgNQņy|ãJhQ!0 K碙f;@).o e~ Izi'FKS^5-vӝǞK`d L{y_%j=QG.A~(PG9*RbO.Ig"YM'#a>NՑQlXh PH`Gl/0l'W'.։_*?i{ e$qt%0{?%J\-BKTX}xiu}G]\"M[EGs]buQ|ȦJH5RSAf؝R°mxB6z"l(IczD bmaKK44du>'+=t[eƒ/&ΖeV̓q|CСrUaDG% $5@`GJvfmʨH(0$_>. 
13145ms (10:51:48.260) Jan 23 10:51:48 crc kubenswrapper[4957]: Trace[410460384]: [13.145957855s] [13.145957855s] END Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.260803 4957 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.262006 4957 trace.go:236] Trace[125811609]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (23-Jan-2026 10:51:33.423) (total time: 14838ms): Jan 23 10:51:48 crc kubenswrapper[4957]: Trace[125811609]: ---"Objects listed" error: 14838ms (10:51:48.261) Jan 23 10:51:48 crc kubenswrapper[4957]: Trace[125811609]: [14.838673255s] [14.838673255s] END Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.262073 4957 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Jan 23 10:51:48 crc kubenswrapper[4957]: E0123 10:51:48.262010 4957 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.262595 4957 trace.go:236] Trace[670785966]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (23-Jan-2026 10:51:34.610) (total time: 13651ms): Jan 23 10:51:48 crc kubenswrapper[4957]: Trace[670785966]: ---"Objects listed" error: 13651ms (10:51:48.262) Jan 23 10:51:48 crc kubenswrapper[4957]: Trace[670785966]: [13.651592205s] [13.651592205s] END Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.262641 4957 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.262758 4957 trace.go:236] Trace[405140664]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (23-Jan-2026 10:51:33.964) (total time: 14297ms): Jan 23 10:51:48 crc kubenswrapper[4957]: Trace[405140664]: ---"Objects listed" error: 14297ms (10:51:48.262) Jan 23 10:51:48 crc kubenswrapper[4957]: Trace[405140664]: [14.297741156s] [14.297741156s] END Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.262800 4957 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.262801 4957 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.671871 4957 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.677578 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.705260 4957 apiserver.go:52] "Watching apiserver" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.707226 4957 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.707484 4957 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c"] Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.707794 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.708046 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.708132 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 23 10:51:48 crc kubenswrapper[4957]: E0123 10:51:48.708217 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 23 10:51:48 crc kubenswrapper[4957]: E0123 10:51:48.708130 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.708263 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.708328 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.708417 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 23 10:51:48 crc kubenswrapper[4957]: E0123 10:51:48.708465 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.710833 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.710877 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.710833 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.710907 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.711037 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.711101 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.711174 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.711331 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.716375 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.719650 4957 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-01 12:06:21.063076179 +0000 UTC Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.753494 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.763745 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.781927 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.797984 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.807863 4957 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.809086 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.818634 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.836869 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.853117 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb53662e-fe72-4c19-b3a6-f5b541e5afcb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://829cfdb541d2a7861316957f39b8b9f43ec6f9f4e309a491f4451b1f3c34a9a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d9f270c80ebedc7d79510e2f421e23789483dce954f5e1469469703660febf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-po
d-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90926087d1bb350c991fa9425706fcc22e12eec003aba87b72758892aae9d3a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8144556693b41dc2f9121be49ceed161caf8db5eec797f086128a2016be8072\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.866897 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.866941 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.866962 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 23 10:51:48 crc 
kubenswrapper[4957]: I0123 10:51:48.866976 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.866993 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.867007 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.867022 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.867037 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.867051 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.867067 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.867086 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.867105 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.867120 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " 
Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.867138 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.867153 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.867189 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.867203 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.867221 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.867236 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.867257 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.867293 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.867311 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.867331 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod 
\"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.867348 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.867362 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.867378 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.867395 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.867418 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.867435 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.867460 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.867476 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.867493 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.867510 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.867524 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.867544 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.867569 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.867591 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.867592 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.867599 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.867607 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.867667 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.867707 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.867741 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.867775 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.867808 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.867841 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.867873 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.867906 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.867937 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.867969 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.868001 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.868034 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.868066 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.868104 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.868136 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.868202 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.868233 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.868266 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.868339 4957 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.868371 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.868404 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.868435 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.868469 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.868503 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.868533 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.868568 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.868599 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.868632 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.868665 4957 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.868702 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.868736 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.868767 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.868803 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.868837 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.868869 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.868901 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.868933 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.868966 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.869005 4957 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.869037 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.869071 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.869105 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.869140 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.869173 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.869206 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.869243 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.869297 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.869332 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.869364 
4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.869396 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.869427 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.869463 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.869495 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.869528 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.869559 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.869592 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.869631 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.869664 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 
10:51:48.869697 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.869729 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.869764 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.869806 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.869839 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.869871 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.869906 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.869937 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.869971 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.870010 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.870043 4957 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.870076 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.870115 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.870173 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.870208 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.870243 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.870300 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.870333 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.870366 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.870406 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 
10:51:48.870443 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.870477 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.870513 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.870546 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.870581 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.870616 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.870648 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.870687 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.870723 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.870760 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Jan 23 10:51:48 crc 
kubenswrapper[4957]: I0123 10:51:48.870794 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.870830 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.870862 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.870895 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.870930 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.870968 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.871001 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.871035 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.871071 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.871107 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: 
\"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.871142 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.871178 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.871220 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.871254 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.871352 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.871387 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.871421 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.871456 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.871490 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.871529 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") 
pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.871566 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.871605 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.871641 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.871675 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.871709 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.871743 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.871778 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.871811 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.871844 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.871879 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod 
\"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.871913 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.871950 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.871996 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.872073 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.872136 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.872210 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.872306 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.872400 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.872464 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.872515 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.872577 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.872772 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.872832 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.872882 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.872940 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.872983 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.873023 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.873059 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.873094 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 
10:51:48.873143 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.873194 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.873239 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.873336 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.873391 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.873428 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.873468 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.873507 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.873544 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.873580 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 
10:51:48.873615 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.873652 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.873793 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.873819 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.873837 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.873855 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.873875 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.873893 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.873910 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.873928 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Jan 23 10:51:48 crc 
kubenswrapper[4957]: I0123 10:51:48.873971 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.873993 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.874133 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.874211 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.874233 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.874315 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.874345 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.874367 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.874387 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: 
\"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.874409 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.874433 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.874458 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.874481 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.874536 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.874706 4957 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.874723 4957 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.883831 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.867807 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). 
InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.867982 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.868007 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.867995 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.868318 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.868359 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.885235 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.885332 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.885356 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.868364 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.868424 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.868446 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.868614 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.868631 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.868780 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.868876 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.869036 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.886232 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.869270 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.869369 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.869325 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.869456 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.869605 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.869685 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.869708 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.869867 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.870078 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.870362 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.870426 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.870475 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.870491 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). 
InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.870552 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.870629 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.871004 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.871068 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.871120 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.871230 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.871333 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.871485 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.886927 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.871518 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.871587 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.871653 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.871901 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.871906 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.872713 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). 
InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.872717 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.872726 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.872942 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.872966 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.873133 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.873163 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.873202 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.873436 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.873439 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.873660 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.874103 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.874124 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.874213 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.874269 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.874351 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.874545 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.874766 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.874900 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.875262 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.875579 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.875661 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.875943 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.876079 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.876184 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.876655 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.877425 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.877446 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.877836 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.878136 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.878313 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). 
InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.878352 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.878268 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.878533 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.878497 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.878745 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.878820 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.879031 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.879211 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.879300 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.879837 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.880049 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.880204 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.880226 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.880402 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.880695 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). 
InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.880769 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.880830 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.880881 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.881026 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.881070 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.881324 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.881445 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.881664 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.881843 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.881853 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.881884 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.881953 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.882169 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.882429 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.882593 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.882837 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.883003 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.883095 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.883182 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.883215 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.883228 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.883402 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.883690 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.884100 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.884251 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.884427 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: E0123 10:51:48.884427 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-23 10:51:49.384266282 +0000 UTC m=+18.921519009 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.884647 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.884766 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.884843 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.885025 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.885103 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.885148 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.885124 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.885381 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.885423 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.885451 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.885710 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.885727 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.885806 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.886002 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.886067 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.887484 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"5e837e02e63dbe59e7920302c0fb0b5c9165e96ebb684adadb02bacd61633214"} Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.887455 4957 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="5e837e02e63dbe59e7920302c0fb0b5c9165e96ebb684adadb02bacd61633214" exitCode=255 Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.886136 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.886150 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.886169 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.886550 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.869096 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.886778 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.886957 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.887016 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.887120 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.887381 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.887530 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.887401 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: E0123 10:51:48.887948 4957 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.888323 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.888424 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.888537 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.888530 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.888590 4957 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Jan 23 10:51:48 crc kubenswrapper[4957]: E0123 10:51:48.888710 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-23 10:51:49.38869306 +0000 UTC m=+18.925945927 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 23 10:51:48 crc kubenswrapper[4957]: E0123 10:51:48.888850 4957 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.889058 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 23 10:51:48 crc kubenswrapper[4957]: E0123 10:51:48.889185 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-23 10:51:49.389157242 +0000 UTC m=+18.926410119 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.891463 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.891519 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.891812 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.892874 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.893207 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.893562 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: E0123 10:51:48.893777 4957 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-crc\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.894342 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.894961 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.895134 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.896005 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.900793 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.900799 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.900973 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.901003 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.901047 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: E0123 10:51:48.906868 4957 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 23 10:51:48 crc kubenswrapper[4957]: E0123 10:51:48.906907 4957 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 23 10:51:48 crc kubenswrapper[4957]: E0123 10:51:48.906928 4957 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 23 10:51:48 crc kubenswrapper[4957]: E0123 10:51:48.906995 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-23 10:51:49.406974124 +0000 UTC m=+18.944226821 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.907205 4957 scope.go:117] "RemoveContainer" containerID="5e837e02e63dbe59e7920302c0fb0b5c9165e96ebb684adadb02bacd61633214" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.907604 4957 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.907918 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.907928 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.908163 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.908602 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.908912 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.909050 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.909256 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.910406 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: E0123 10:51:48.912407 4957 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 23 10:51:48 crc kubenswrapper[4957]: E0123 10:51:48.912456 4957 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 23 10:51:48 crc kubenswrapper[4957]: E0123 10:51:48.912482 4957 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 23 10:51:48 crc kubenswrapper[4957]: E0123 10:51:48.912572 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-23 10:51:49.412543962 +0000 UTC m=+18.949796819 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.913031 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.913067 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.913260 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.913554 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.913817 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.914230 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.915421 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.917012 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.917082 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.917149 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.917146 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.917400 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.917415 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.917524 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.917575 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.917757 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.917858 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.917988 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.918431 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.918774 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.919067 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.919595 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.921835 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). 
InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.921996 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.922267 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.923684 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.927913 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.929795 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.936820 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb53662e-fe72-4c19-b3a6-f5b541e5afcb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://829cfdb541d2a7861316957f39b8b9f43ec6f9f4e309a491f4451b1f3c34a9a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d9f270c80ebedc7d79510e2f421e23789483dce954f5e1469469703660febf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90926087d1bb350c991fa9425706fcc22e12eec003aba87b72758892aae9d3a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8144556693b41dc2f9121be49ceed161caf8db5eec797f086128a2016be8072\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.943271 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.944805 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.946763 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.947450 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.961154 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.971367 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.975694 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.975765 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.975945 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.975970 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.976088 4957 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.976154 4957 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.976176 4957 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.976197 4957 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: 
\"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.976216 4957 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.976234 4957 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.976252 4957 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.976270 4957 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.976339 4957 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.976359 4957 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.976378 4957 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.976394 4957 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.976411 4957 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.976429 4957 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.976447 4957 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.976465 4957 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.976483 4957 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: 
\"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.976501 4957 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.976518 4957 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.976535 4957 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.976555 4957 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.976572 4957 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.976589 4957 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.976606 4957 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.976623 4957 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.976636 4957 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.976644 4957 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.976652 4957 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.976660 4957 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.976669 4957 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.976677 4957 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.976685 4957 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.976692 4957 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.976701 4957 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.976709 4957 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.976717 4957 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.976725 4957 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.976733 4957 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.976741 4957 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.976749 4957 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.976757 4957 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.976765 4957 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.976772 4957 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.976780 4957 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.976788 4957 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.976796 4957 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.976804 4957 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.976812 4957 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.976821 4957 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.976830 4957 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.976842 4957 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.976852 4957 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.976860 4957 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.976868 4957 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.976876 4957 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.976884 4957 
reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.976893 4957 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.976901 4957 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.976909 4957 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.976918 4957 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.976926 4957 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.976934 4957 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.976942 4957 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.976950 4957 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.976958 4957 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.976965 4957 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.976973 4957 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.976981 4957 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.976989 4957 reconciler_common.go:293] "Volume detached for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.976999 4957 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977007 4957 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977016 4957 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977024 4957 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977034 4957 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977042 4957 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977050 4957 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977059 4957 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977066 4957 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977075 4957 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977083 4957 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977092 4957 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977099 4957 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977107 4957 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977115 4957 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977123 4957 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977132 4957 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977140 4957 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977147 4957 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977155 4957 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977163 4957 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977171 4957 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977180 4957 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977188 4957 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977197 4957 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977205 4957 reconciler_common.go:293] "Volume detached for volume 
\"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977212 4957 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977220 4957 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977228 4957 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977236 4957 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977244 4957 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977252 4957 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977260 4957 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977267 4957 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977275 4957 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977295 4957 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977304 4957 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977311 4957 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977319 4957 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" 
(UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977326 4957 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977336 4957 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977345 4957 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977352 4957 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977362 4957 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977370 4957 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977378 4957 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977387 4957 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977396 4957 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977404 4957 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977412 4957 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977421 4957 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977429 4957 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977438 4957 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977445 4957 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977453 4957 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977461 4957 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977470 4957 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977478 4957 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977486 4957 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977494 4957 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977502 4957 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977520 4957 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977528 4957 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977536 4957 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977547 4957 
reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977555 4957 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977563 4957 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977571 4957 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977579 4957 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977589 4957 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977597 4957 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977606 4957 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977614 4957 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977622 4957 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977630 4957 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977637 4957 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977645 4957 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977653 4957 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977661 4957 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977669 4957 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977677 4957 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977685 4957 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977693 4957 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977701 4957 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977710 4957 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977718 4957 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977726 4957 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977734 4957 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977743 4957 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977752 4957 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977760 4957 reconciler_common.go:293] "Volume detached for volume 
\"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977768 4957 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977776 4957 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977784 4957 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977791 4957 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977799 4957 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977808 4957 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977816 4957 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977824 4957 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977833 4957 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977841 4957 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977848 4957 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977856 4957 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977864 4957 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977871 4957 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977879 4957 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977887 4957 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977895 4957 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977903 4957 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977913 4957 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977921 4957 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977930 4957 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977939 4957 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977947 4957 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977955 4957 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977963 4957 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977974 4957 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" 
DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977983 4957 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977991 4957 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.977999 4957 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.978007 4957 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.978015 4957 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:48 crc kubenswrapper[4957]: I0123 10:51:48.978023 4957 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 23 10:51:49 crc kubenswrapper[4957]: I0123 10:51:49.022823 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 23 10:51:49 crc kubenswrapper[4957]: I0123 10:51:49.027819 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 23 10:51:49 crc kubenswrapper[4957]: I0123 10:51:49.034824 4957 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 23 10:51:49 crc kubenswrapper[4957]: W0123 10:51:49.041739 4957 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-173fec8f7d1e300e262cddf05edb9a825acc97c1bde59c5ed55dac5f66b7043f WatchSource:0}: Error finding container 173fec8f7d1e300e262cddf05edb9a825acc97c1bde59c5ed55dac5f66b7043f: Status 404 returned error can't find the container with id 173fec8f7d1e300e262cddf05edb9a825acc97c1bde59c5ed55dac5f66b7043f Jan 23 10:51:49 crc kubenswrapper[4957]: W0123 10:51:49.050305 4957 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-8d902d44e781e91d50867acb478204cccb2b7b3e03485d148d01f7d59b98bbea WatchSource:0}: Error finding container 8d902d44e781e91d50867acb478204cccb2b7b3e03485d148d01f7d59b98bbea: Status 404 returned error can't find the container with id 8d902d44e781e91d50867acb478204cccb2b7b3e03485d148d01f7d59b98bbea Jan 23 10:51:49 crc kubenswrapper[4957]: I0123 10:51:49.081663 4957 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 23 10:51:49 crc kubenswrapper[4957]: I0123 10:51:49.112560 4957 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Jan 23 10:51:49 crc kubenswrapper[4957]: I0123 10:51:49.126841 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 23 10:51:49 crc kubenswrapper[4957]: I0123 10:51:49.133531 4957 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Jan 23 10:51:49 crc kubenswrapper[4957]: I0123 10:51:49.134732 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Jan 23 10:51:49 crc kubenswrapper[4957]: I0123 10:51:49.138334 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea507738-b425-4366-808b-3a47317e66d0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cdf71b1a8491d3a4853fde19a5b1af1eb4697cbf07de482e22a52704ba0470f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52eea4c3c7c3b8898e64dd0eb05c1883ea1c2fa94e7e606f3ab48bbf5aaee8d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f405b6b517d30a201b793965bd82536f496d62b89562cefc7e3a9d9f7829633\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e837e02e63dbe59e7920302c0fb0b5c9165e96ebb684adadb02bacd61633214\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e837e02e63dbe59e7920302c0fb0b5c9165e96ebb684adadb02bacd61633214\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-23
T10:51:48Z\\\",\\\"message\\\":\\\"le observer\\\\nW0123 10:51:48.273886 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0123 10:51:48.273997 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0123 10:51:48.275269 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3502356273/tls.crt::/tmp/serving-cert-3502356273/tls.key\\\\\\\"\\\\nI0123 10:51:48.537137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0123 10:51:48.548481 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0123 10:51:48.548515 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0123 10:51:48.548544 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0123 10:51:48.548577 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0123 10:51:48.561057 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0123 10:51:48.561112 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0123 10:51:48.561123 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0123 10:51:48.561139 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0123 10:51:48.561151 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0123 10:51:48.561158 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0123 10:51:48.561167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0123 10:51:48.561209 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0123 10:51:48.561756 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da77583099215643577c5d064d67ce2cca9d0b74e7ba7c88f3a948a8516fd66c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd61e9841e88c0a34764491cfafe3e4f94233e4ac0543a3e6ed1326dd8d7280d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd61e9841e88c0a34764491cfafe3e4f94233e4ac0543a3e6ed1326dd8d7280d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 23 10:51:49 crc kubenswrapper[4957]: I0123 10:51:49.152711 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb53662e-fe72-4c19-b3a6-f5b541e5afcb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://829cfdb541d2a7861316957f39b8b9f43ec6f9f4e309a491f4451b1f3c34a9a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d9f270c80ebedc7d79510e2f421e23789483dce954f5e1469469703660febf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90926087d1bb350c991fa9425706fcc22e12eec003aba87b72758892aae9d3a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8144556693b41dc2f9121be49ceed161caf8db5eec797f086128a2016be8072\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 23 10:51:49 crc kubenswrapper[4957]: I0123 10:51:49.163386 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 23 10:51:49 crc kubenswrapper[4957]: I0123 10:51:49.174796 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 23 10:51:49 crc kubenswrapper[4957]: I0123 10:51:49.187841 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 23 10:51:49 crc kubenswrapper[4957]: I0123 10:51:49.204996 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 23 10:51:49 crc kubenswrapper[4957]: I0123 10:51:49.216106 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 23 10:51:49 crc kubenswrapper[4957]: I0123 10:51:49.234626 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8631604-63ce-40b0-b27e-fba17f940f20\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://523b9a208f414955faffe254957d3bb6d287eab26ea653e23c9bcc2c3182d5cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80645d17b02b24a907b20d376fcb65a794768d4c9cf07550bff63d50a011836d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3fcfc9fcf5f37f32b4a654710f3f0f5c3fab5b0b5c35239e5f1a2789d1ec480\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b48e3593322a778bf8d56e3509d97a341f1fee5e172f8ba4bbc4c1dacefb3930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://193091ca5d5fb974b1e2da289e7fbc6e2d3a292e79d1936c7ba10266a5ba9779\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7feaf0932d8842db7af2f28984a811c16674b50ba28a4627829ce8471543daa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7feaf0932d8842db7af2f2898
4a811c16674b50ba28a4627829ce8471543daa9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://370f9a1676bfe308ad9883454af09a802df678db77e0e246de3cc7aea91410a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://370f9a1676bfe308ad9883454af09a802df678db77e0e246de3cc7aea91410a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://78b38484db8444b196494249f32bd63f2478d80e89ff823c8ac7c974b1ea6e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78b38484db8444b196494249f32bd63f2478d80e89ff823c8ac7c974b1ea6e76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 23 10:51:49 crc kubenswrapper[4957]: I0123 10:51:49.258707 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready 
status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 23 10:51:49 crc kubenswrapper[4957]: I0123 10:51:49.275834 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 23 10:51:49 crc kubenswrapper[4957]: I0123 10:51:49.288924 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 23 10:51:49 crc kubenswrapper[4957]: I0123 10:51:49.301448 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea507738-b425-4366-808b-3a47317e66d0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cdf71b1a8491d3a4853fde19a5b1af1eb4697cbf07de482e22a52704ba0470f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52eea4c3c7c3b8898e64dd0eb05c1883ea1c2fa94e7e606f3ab48bbf5aaee8d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f405b6b517d30a201b793965bd82536f496d62b89562cefc7e3a9d9f7829633\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e837e02e63dbe59e7920302c0fb0b5c9165e96ebb684adadb02bacd61633214\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e837e02e63dbe59e7920302c0fb0b5c9165e96ebb684adadb02bacd61633214\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-23
T10:51:48Z\\\",\\\"message\\\":\\\"le observer\\\\nW0123 10:51:48.273886 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0123 10:51:48.273997 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0123 10:51:48.275269 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3502356273/tls.crt::/tmp/serving-cert-3502356273/tls.key\\\\\\\"\\\\nI0123 10:51:48.537137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0123 10:51:48.548481 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0123 10:51:48.548515 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0123 10:51:48.548544 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0123 10:51:48.548577 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0123 10:51:48.561057 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0123 10:51:48.561112 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0123 10:51:48.561123 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0123 10:51:48.561139 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0123 10:51:48.561151 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0123 10:51:48.561158 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0123 10:51:48.561167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0123 10:51:48.561209 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0123 10:51:48.561756 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da77583099215643577c5d064d67ce2cca9d0b74e7ba7c88f3a948a8516fd66c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd61e9841e88c0a34764491cfafe3e4f94233e4ac0543a3e6ed1326dd8d7280d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd61e9841e88c0a34764491cfafe3e4f94233e4ac0543a3e6ed1326dd8d7280d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 23 10:51:49 crc kubenswrapper[4957]: I0123 10:51:49.313672 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb53662e-fe72-4c19-b3a6-f5b541e5afcb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://829cfdb541d2a7861316957f39b8b9f43ec6f9f4e309a491f4451b1f3c34a9a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d9f270c80ebedc7d79510e2f421e23789483dce954f5e1469469703660febf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90926087d1bb350c991fa9425706fcc22e12eec003aba87b72758892aae9d3a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8144556693b41dc2f9121be49ceed161caf8db5eec797f086128a2016be8072\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 23 10:51:49 crc kubenswrapper[4957]: I0123 10:51:49.324242 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 23 10:51:49 crc kubenswrapper[4957]: I0123 10:51:49.336663 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 23 10:51:49 crc kubenswrapper[4957]: I0123 10:51:49.347577 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 23 10:51:49 crc kubenswrapper[4957]: I0123 10:51:49.481851 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 23 10:51:49 crc kubenswrapper[4957]: E0123 10:51:49.481992 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-23 10:51:50.481949023 +0000 UTC m=+20.019201710 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 23 10:51:49 crc kubenswrapper[4957]: I0123 10:51:49.482076 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 23 10:51:49 crc kubenswrapper[4957]: I0123 10:51:49.482102 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 23 10:51:49 crc kubenswrapper[4957]: E0123 10:51:49.482231 4957 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 23 10:51:49 crc kubenswrapper[4957]: E0123 10:51:49.482248 4957 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 
23 10:51:49 crc kubenswrapper[4957]: E0123 10:51:49.482261 4957 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 23 10:51:49 crc kubenswrapper[4957]: E0123 10:51:49.482324 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-23 10:51:50.482314404 +0000 UTC m=+20.019567091 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 23 10:51:49 crc kubenswrapper[4957]: I0123 10:51:49.482351 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 23 10:51:49 crc kubenswrapper[4957]: I0123 10:51:49.482405 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 23 10:51:49 crc kubenswrapper[4957]: E0123 10:51:49.482346 4957 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 23 10:51:49 crc kubenswrapper[4957]: E0123 10:51:49.482495 4957 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 23 10:51:49 crc kubenswrapper[4957]: E0123 10:51:49.482508 4957 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 23 10:51:49 crc kubenswrapper[4957]: E0123 10:51:49.482516 4957 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 23 10:51:49 crc kubenswrapper[4957]: E0123 10:51:49.482517 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-23 10:51:50.482495768 +0000 UTC m=+20.019748545 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 23 10:51:49 crc kubenswrapper[4957]: E0123 10:51:49.482584 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-23 10:51:50.48257487 +0000 UTC m=+20.019827647 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 23 10:51:49 crc kubenswrapper[4957]: E0123 10:51:49.482427 4957 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 23 10:51:49 crc kubenswrapper[4957]: E0123 10:51:49.482643 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-23 10:51:50.482635172 +0000 UTC m=+20.019887979 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 23 10:51:49 crc kubenswrapper[4957]: I0123 10:51:49.678246 4957 csr.go:261] certificate signing request csr-ff24x is approved, waiting to be issued Jan 23 10:51:49 crc kubenswrapper[4957]: I0123 10:51:49.702717 4957 csr.go:257] certificate signing request csr-ff24x is issued Jan 23 10:51:49 crc kubenswrapper[4957]: I0123 10:51:49.712433 4957 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-fnxz6"] Jan 23 10:51:49 crc kubenswrapper[4957]: I0123 10:51:49.712694 4957 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-fnxz6" Jan 23 10:51:49 crc kubenswrapper[4957]: I0123 10:51:49.715476 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Jan 23 10:51:49 crc kubenswrapper[4957]: I0123 10:51:49.716572 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Jan 23 10:51:49 crc kubenswrapper[4957]: I0123 10:51:49.717022 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Jan 23 10:51:49 crc kubenswrapper[4957]: I0123 10:51:49.720216 4957 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-08 11:23:08.133460804 +0000 UTC Jan 23 10:51:49 crc kubenswrapper[4957]: I0123 10:51:49.737768 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fnxz6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c7b1449-2e9b-4c07-a531-591cb968f511\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj687\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:49Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fnxz6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:49Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:49 crc kubenswrapper[4957]: I0123 10:51:49.785835 4957 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gj687\" (UniqueName: \"kubernetes.io/projected/4c7b1449-2e9b-4c07-a531-591cb968f511-kube-api-access-gj687\") pod \"node-resolver-fnxz6\" (UID: \"4c7b1449-2e9b-4c07-a531-591cb968f511\") " pod="openshift-dns/node-resolver-fnxz6" Jan 23 10:51:49 crc kubenswrapper[4957]: I0123 10:51:49.785892 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/4c7b1449-2e9b-4c07-a531-591cb968f511-hosts-file\") pod \"node-resolver-fnxz6\" (UID: \"4c7b1449-2e9b-4c07-a531-591cb968f511\") " pod="openshift-dns/node-resolver-fnxz6" Jan 23 10:51:49 crc kubenswrapper[4957]: I0123 10:51:49.811829 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8631604-63ce-40b0-b27e-fba17f940f20\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://523b9a208f414955faffe254957d3bb6d287eab26ea653e23c9bcc2c3182d5cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80645d17b02b24a907b20d376fcb65a794768d4c9cf07550bff63d50a011836d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"moun
tPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3fcfc9fcf5f37f32b4a654710f3f0f5c3fab5b0b5c35239e5f1a2789d1ec480\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b48e3593322a778bf8d56e3509d97a341f1fee5e172f8ba4bbc4c1dacefb3930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://193091ca5d5fb974b1e2da289e7fbc6e2d3a292e79d1936c7ba10266a5ba9779\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7feaf0932d8842db7af2f28984a811c16674b50ba28a4627829ce8471543daa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7feaf0932d8842db7af2f28984a811c16674b50ba28a4627829ce8471543daa9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-di
r\\\"}]},{\\\"containerID\\\":\\\"cri-o://370f9a1676bfe308ad9883454af09a802df678db77e0e246de3cc7aea91410a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://370f9a1676bfe308ad9883454af09a802df678db77e0e246de3cc7aea91410a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://78b38484db8444b196494249f32bd63f2478d80e89ff823c8ac7c974b1ea6e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78b38484db8444b196494249f32bd63f2478d80e89ff823c8ac7c974b1ea6e76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:49Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:49 crc kubenswrapper[4957]: I0123 10:51:49.817405 4957 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-rg9hb"] Jan 23 10:51:49 crc kubenswrapper[4957]: I0123 10:51:49.817646 4957 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-rg9hb" Jan 23 10:51:49 crc kubenswrapper[4957]: I0123 10:51:49.818895 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Jan 23 10:51:49 crc kubenswrapper[4957]: I0123 10:51:49.820108 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Jan 23 10:51:49 crc kubenswrapper[4957]: I0123 10:51:49.820955 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Jan 23 10:51:49 crc kubenswrapper[4957]: I0123 10:51:49.821114 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Jan 23 10:51:49 crc kubenswrapper[4957]: I0123 10:51:49.832497 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:49Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:49 crc kubenswrapper[4957]: I0123 10:51:49.845960 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:49Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:49 crc kubenswrapper[4957]: I0123 10:51:49.857856 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:49Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:49 crc kubenswrapper[4957]: I0123 10:51:49.873647 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:49Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:49 crc kubenswrapper[4957]: I0123 10:51:49.886779 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gj687\" (UniqueName: \"kubernetes.io/projected/4c7b1449-2e9b-4c07-a531-591cb968f511-kube-api-access-gj687\") pod \"node-resolver-fnxz6\" (UID: \"4c7b1449-2e9b-4c07-a531-591cb968f511\") " pod="openshift-dns/node-resolver-fnxz6" Jan 23 10:51:49 crc kubenswrapper[4957]: I0123 10:51:49.886817 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/4c7b1449-2e9b-4c07-a531-591cb968f511-hosts-file\") pod \"node-resolver-fnxz6\" (UID: \"4c7b1449-2e9b-4c07-a531-591cb968f511\") " pod="openshift-dns/node-resolver-fnxz6" Jan 23 10:51:49 crc kubenswrapper[4957]: I0123 10:51:49.886897 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/4c7b1449-2e9b-4c07-a531-591cb968f511-hosts-file\") pod \"node-resolver-fnxz6\" (UID: \"4c7b1449-2e9b-4c07-a531-591cb968f511\") " pod="openshift-dns/node-resolver-fnxz6" Jan 23 10:51:49 crc kubenswrapper[4957]: I0123 10:51:49.889525 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:49Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:49 crc kubenswrapper[4957]: I0123 10:51:49.891969 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Jan 23 10:51:49 crc kubenswrapper[4957]: I0123 10:51:49.893425 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"4b6915f908509c8609290327ffc2dccf0e5680dc227979285a7ebaca4643cb7a"} Jan 23 10:51:49 crc kubenswrapper[4957]: I0123 10:51:49.893696 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 23 10:51:49 crc kubenswrapper[4957]: I0123 10:51:49.903065 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"8d902d44e781e91d50867acb478204cccb2b7b3e03485d148d01f7d59b98bbea"} Jan 23 10:51:49 crc kubenswrapper[4957]: I0123 10:51:49.904541 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"b9f19c40d295c11e3a1170d61fd738b1dcd8fb10087f6a1bb74e6e6c8e6cfb56"} Jan 23 10:51:49 crc 
kubenswrapper[4957]: I0123 10:51:49.904663 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"e6fcda9eaf99f7b60db85da6f18a98ccca7b5bc532aa28388fc7845caf1a7356"} Jan 23 10:51:49 crc kubenswrapper[4957]: I0123 10:51:49.904749 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"173fec8f7d1e300e262cddf05edb9a825acc97c1bde59c5ed55dac5f66b7043f"} Jan 23 10:51:49 crc kubenswrapper[4957]: I0123 10:51:49.905587 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"512cd439903792d034cd6017d149d8f3e9e24ffbfc36964572fc9419d54c3513"} Jan 23 10:51:49 crc kubenswrapper[4957]: I0123 10:51:49.905634 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"51bfe1340d43f4bd22b3da43acd20a37344d0734d62fc56d5f22fcc13f9704a2"} Jan 23 10:51:49 crc kubenswrapper[4957]: I0123 10:51:49.907528 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:49Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:49 crc kubenswrapper[4957]: I0123 10:51:49.926777 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gj687\" (UniqueName: \"kubernetes.io/projected/4c7b1449-2e9b-4c07-a531-591cb968f511-kube-api-access-gj687\") pod \"node-resolver-fnxz6\" (UID: \"4c7b1449-2e9b-4c07-a531-591cb968f511\") " pod="openshift-dns/node-resolver-fnxz6" Jan 23 10:51:49 crc kubenswrapper[4957]: I0123 10:51:49.931748 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea507738-b425-4366-808b-3a47317e66d0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cdf71b1a8491d3a4853fde19a5b1af1eb4697cbf07de482e22a52704ba0470f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52eea4c3c7c3b8898e64dd0eb05c1883ea1c2fa94e7e606f3ab48bbf5aaee8d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f405b6b517d30a201b793965bd82536f496d62b89562cefc7e3a9d9f7829633\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e837e02e63dbe59e7920302c0fb0b5c9165e96ebb684adadb02bacd61633214\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e837e02e63dbe59e7920302c0fb0b5c9165e96ebb684adadb02bacd61633214\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-23
T10:51:48Z\\\",\\\"message\\\":\\\"le observer\\\\nW0123 10:51:48.273886 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0123 10:51:48.273997 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0123 10:51:48.275269 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3502356273/tls.crt::/tmp/serving-cert-3502356273/tls.key\\\\\\\"\\\\nI0123 10:51:48.537137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0123 10:51:48.548481 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0123 10:51:48.548515 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0123 10:51:48.548544 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0123 10:51:48.548577 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0123 10:51:48.561057 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0123 10:51:48.561112 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0123 10:51:48.561123 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0123 10:51:48.561139 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0123 10:51:48.561151 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0123 10:51:48.561158 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0123 10:51:48.561167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0123 10:51:48.561209 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0123 10:51:48.561756 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da77583099215643577c5d064d67ce2cca9d0b74e7ba7c88f3a948a8516fd66c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd61e9841e88c0a34764491cfafe3e4f94233e4ac0543a3e6ed1326dd8d7280d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd61e9841e88c0a34764491cfafe3e4f94233e4ac0543a3e6ed1326dd8d7280d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:49Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:49 crc kubenswrapper[4957]: I0123 10:51:49.942811 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb53662e-fe72-4c19-b3a6-f5b541e5afcb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://829cfdb541d2a7861316957f39b8b9f43ec6f9f4e309a491f4451b1f3c34a9a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d9f270c80ebedc7d79510e2f421e23789483dce954f5e1469469703660febf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90926087d1bb350c991fa9425706fcc22e12eec003aba87b72758892aae9d3a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8144556693b41dc2f9121be49ceed161caf8db5eec797f086128a2016be8072\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:49Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:49 crc kubenswrapper[4957]: I0123 10:51:49.966829 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8631604-63ce-40b0-b27e-fba17f940f20\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://523b9a208f414955faffe254957d3bb6d287eab26ea653e23c9bcc2c3182d5cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80645d17b02b24a907b20d376fcb65a794768d4c9cf07550bff63d50a011836d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07
b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3fcfc9fcf5f37f32b4a654710f3f0f5c3fab5b0b5c35239e5f1a2789d1ec480\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b48e3593322a778bf8d56e3509d97a341f1fee5e172f8ba4bbc4c1dacefb3930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://193091ca5d5fb974b1e2da289e7fbc6e2d3a292e79d1936c7ba10266a5ba9779\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7feaf0932d8842db7af2f28984a811c16674b50ba28a4627829ce8471543daa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\
\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7feaf0932d8842db7af2f28984a811c16674b50ba28a4627829ce8471543daa9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://370f9a1676bfe308ad9883454af09a802df678db77e0e246de3cc7aea91410a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://370f9a1676bfe308ad9883454af09a802df678db77e0e246de3cc7aea91410a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://78b38484db8444b196494249f32bd63f2478d80e89ff823c8ac7c974b1ea6e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78b38484db8444b196494249f32bd63f2478d80e89ff823c8ac7c974b1ea6e76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:49Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:49 crc kubenswrapper[4957]: I0123 10:51:49.978295 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://512cd439903792d034cd6017d149d8f3e9e24ffbfc36964572fc9419d54c3513\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:49Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:49 crc kubenswrapper[4957]: I0123 10:51:49.987325 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wngq\" (UniqueName: \"kubernetes.io/projected/48a6ddd9-627a-4faa-a4c4-096ea19af31d-kube-api-access-5wngq\") pod \"node-ca-rg9hb\" (UID: \"48a6ddd9-627a-4faa-a4c4-096ea19af31d\") " pod="openshift-image-registry/node-ca-rg9hb" Jan 23 10:51:49 crc kubenswrapper[4957]: I0123 10:51:49.987550 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/48a6ddd9-627a-4faa-a4c4-096ea19af31d-serviceca\") pod \"node-ca-rg9hb\" (UID: \"48a6ddd9-627a-4faa-a4c4-096ea19af31d\") " pod="openshift-image-registry/node-ca-rg9hb" Jan 23 10:51:49 crc kubenswrapper[4957]: I0123 10:51:49.987786 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/48a6ddd9-627a-4faa-a4c4-096ea19af31d-host\") pod \"node-ca-rg9hb\" (UID: \"48a6ddd9-627a-4faa-a4c4-096ea19af31d\") " pod="openshift-image-registry/node-ca-rg9hb" Jan 23 10:51:49 crc kubenswrapper[4957]: I0123 10:51:49.994477 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:49Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.005799 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:50Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.025541 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-fnxz6" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.025692 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fnxz6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c7b1449-2e9b-4c07-a531-591cb968f511\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj687\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:49Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fnxz6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:50Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.059249 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9f19c40d295c11e3a1170d61fd738b1dcd8fb10087f6a1bb74e6e6c8e6cfb56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6fcda9eaf99f7b60db85da6f18a98ccca7b5bc532aa28388fc7845caf1a7356\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-de
v/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:50Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.088800 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wngq\" (UniqueName: \"kubernetes.io/projected/48a6ddd9-627a-4faa-a4c4-096ea19af31d-kube-api-access-5wngq\") pod \"node-ca-rg9hb\" (UID: \"48a6ddd9-627a-4faa-a4c4-096ea19af31d\") " pod="openshift-image-registry/node-ca-rg9hb" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.088878 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/48a6ddd9-627a-4faa-a4c4-096ea19af31d-serviceca\") pod \"node-ca-rg9hb\" (UID: \"48a6ddd9-627a-4faa-a4c4-096ea19af31d\") " pod="openshift-image-registry/node-ca-rg9hb" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.088913 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/48a6ddd9-627a-4faa-a4c4-096ea19af31d-host\") pod \"node-ca-rg9hb\" (UID: \"48a6ddd9-627a-4faa-a4c4-096ea19af31d\") " pod="openshift-image-registry/node-ca-rg9hb" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.088958 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/48a6ddd9-627a-4faa-a4c4-096ea19af31d-host\") pod \"node-ca-rg9hb\" (UID: \"48a6ddd9-627a-4faa-a4c4-096ea19af31d\") " pod="openshift-image-registry/node-ca-rg9hb" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.089713 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/48a6ddd9-627a-4faa-a4c4-096ea19af31d-serviceca\") pod \"node-ca-rg9hb\" (UID: \"48a6ddd9-627a-4faa-a4c4-096ea19af31d\") " pod="openshift-image-registry/node-ca-rg9hb" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.124819 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:50Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.145350 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rg9hb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a6ddd9-627a-4faa-a4c4-096ea19af31d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wngq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rg9hb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:50Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.162180 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea507738-b425-4366-808b-3a47317e66d0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cdf71b1a8491d3a4853fde19a5b1af1eb4697cbf07de482e22a52704ba0470f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52eea4c3c7c3b8898e64dd0eb05c1883ea1c2fa94e7e606f3ab48bbf5aaee8d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f405b6b517d30a201b793965bd82536f496d62b89562cefc7e3a9d9f7829633\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b6915f908509c8609290327ffc2dccf0e5680dc227979285a7ebaca4643cb7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e837e02e63dbe59e7920302c0fb0b5c9165e96ebb684adadb02bacd61633214\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"le observer\\\\nW0123 10:51:48.273886 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0123 10:51:48.273997 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0123 10:51:48.275269 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3502356273/tls.crt::/tmp/serving-cert-3502356273/tls.key\\\\\\\"\\\\nI0123 10:51:48.537137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0123 10:51:48.548481 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0123 10:51:48.548515 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0123 10:51:48.548544 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0123 10:51:48.548577 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0123 10:51:48.561057 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0123 10:51:48.561112 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0123 10:51:48.561123 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0123 10:51:48.561139 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0123 10:51:48.561151 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0123 10:51:48.561158 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0123 10:51:48.561167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0123 10:51:48.561209 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0123 10:51:48.561756 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da77583099215643577c5d064d67ce2cca9d0b74e7ba7c88f3a948a8516fd66c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd61e9841e88c0a34764491cfafe3e4f94233e4ac0543a3e6ed1326dd8d7280d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd61e9841e88c0a34764491cfafe3e4f94233e4ac0543a3e6ed1326dd8d7280d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:50Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.176228 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb53662e-fe72-4c19-b3a6-f5b541e5afcb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://829cfdb541d2a7861316957f39b8b9f43ec6f9f4e309a491f4451b1f3c34a9a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d9f270c80ebedc7d79510e2f421e23789483dce954f5e1469469703660febf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90926087d1bb350c991fa9425706fcc22e12eec003aba87b72758892aae9d3a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8144556693b41dc2f9121be49ceed161caf8db5eec797f086128a2016be8072\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:50Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.177822 4957 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-w2xjv"] Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.178124 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-w2xjv" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.180643 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wngq\" (UniqueName: \"kubernetes.io/projected/48a6ddd9-627a-4faa-a4c4-096ea19af31d-kube-api-access-5wngq\") pod \"node-ca-rg9hb\" (UID: \"48a6ddd9-627a-4faa-a4c4-096ea19af31d\") " pod="openshift-image-registry/node-ca-rg9hb" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.182420 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.182669 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.184757 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.184787 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.185564 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.195502 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:50Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.221098 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8631604-63ce-40b0-b27e-fba17f940f20\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://523b9a208f414955faffe254957d3bb6d287eab26ea653e23c9bcc2c3182d5cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80645d17b02b24a907b20d376fcb65a794768d4c9cf07550bff63d50a011836d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3fcfc9fcf5f37f32b4a654710f3f0f5c3fab5b0b5c35239e5f1a2789d1ec480\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b48e3593322a778bf8d56e3509d97a341f1fee5
e172f8ba4bbc4c1dacefb3930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://193091ca5d5fb974b1e2da289e7fbc6e2d3a292e79d1936c7ba10266a5ba9779\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7feaf0932d8842db7af2f28984a811c16674b50ba28a4627829ce8471543daa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7feaf0932d8842db7af2f28984a811c16674b50ba28a4627829ce8471543daa9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://370f9a1676bfe308ad9883454af09a802df678db77e0e246de3cc7aea91410a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://370f9a1676bfe308ad9883454af09a802df678db77e0e246de3cc7aea91410a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://78b38484db8444b196494249f32bd63f2478d80e89ff823c8ac7c974b1ea6e76\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78b38484db8444b196494249f32bd63f2478d80e89ff823c8ac7c974b1ea6e76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:50Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.234012 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:50Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.248569 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fnxz6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c7b1449-2e9b-4c07-a531-591cb968f511\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj687\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:49Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fnxz6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:50Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.263596 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea507738-b425-4366-808b-3a47317e66d0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cdf71b1a8491d3a4853fde19a5b1af1eb4697cbf07de482e22a52704ba0470f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52eea4c3c7c3b8898e64dd0eb05c1883ea1c2fa94e7e606f3ab48bbf5aaee8d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f405b6b517d30a201b793965bd82536f496d62b89562cefc7e3a9d9f7829633\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e2
7753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b6915f908509c8609290327ffc2dccf0e5680dc227979285a7ebaca4643cb7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e837e02e63dbe59e7920302c0fb0b5c9165e96ebb684adadb02bacd61633214\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"le observer\\\\nW0123 10:51:48.273886 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0123 10:51:48.273997 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0123 10:51:48.275269 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3502356273/tls.crt::/tmp/serving-cert-3502356273/tls.key\\\\\\\"\\\\nI0123 10:51:48.537137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0123 10:51:48.548481 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0123 10:51:48.548515 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0123 10:51:48.548544 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0123 10:51:48.548577 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0123 10:51:48.561057 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0123 10:51:48.561112 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0123 10:51:48.561123 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0123 10:51:48.561139 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0123 10:51:48.561151 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0123 10:51:48.561158 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0123 10:51:48.561167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0123 10:51:48.561209 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0123 10:51:48.561756 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da77583099215643577c5d064d67ce2cca9d0b74e7ba7c88f3a948a8516fd66c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd61e9841e88c0a34764491cfafe3e4f94233e4ac0543a3e6ed1326dd8d7280d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd61e9841e88c0a34764491cfafe3e4f94233e4ac0543a3e6ed1326dd8d7280d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:50Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.279220 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:50Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.290927 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s26mf\" (UniqueName: \"kubernetes.io/projected/224e3211-1f68-4673-8975-7e71b1e513d0-kube-api-access-s26mf\") pod \"machine-config-daemon-w2xjv\" (UID: \"224e3211-1f68-4673-8975-7e71b1e513d0\") " pod="openshift-machine-config-operator/machine-config-daemon-w2xjv" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.290937 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9f19c40d295c11e3a1170d61fd738b1dcd8fb10087f6a1bb74e6e6c8e6cfb56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6fcda9eaf99f7b60db85da6f18a98ccca7b5bc532aa28388fc7845caf1a7356\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:50Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.290981 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/224e3211-1f68-4673-8975-7e71b1e513d0-rootfs\") pod \"machine-config-daemon-w2xjv\" (UID: \"224e3211-1f68-4673-8975-7e71b1e513d0\") " pod="openshift-machine-config-operator/machine-config-daemon-w2xjv" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.291122 
4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/224e3211-1f68-4673-8975-7e71b1e513d0-mcd-auth-proxy-config\") pod \"machine-config-daemon-w2xjv\" (UID: \"224e3211-1f68-4673-8975-7e71b1e513d0\") " pod="openshift-machine-config-operator/machine-config-daemon-w2xjv" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.291156 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/224e3211-1f68-4673-8975-7e71b1e513d0-proxy-tls\") pod \"machine-config-daemon-w2xjv\" (UID: \"224e3211-1f68-4673-8975-7e71b1e513d0\") " pod="openshift-machine-config-operator/machine-config-daemon-w2xjv" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.305099 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:50Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.329524 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://512cd439903792d034cd6017d149d8f3e9e24ffbfc36964572fc9419d54c3513\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:50Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.346265 4957 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:50Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.370698 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb53662e-fe72-4c19-b3a6-f5b541e5afcb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://829cfdb541d2a7861316957f39b8b9f43ec6f9f4e309a491f4451b1f3c34a9a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d9f270c80ebedc7d79510e2f421e23789483dce954f5e1469469703660febf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90926087d1bb350c991fa9425706fcc22e12eec003aba87b72758892aae9d3a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8144556693b41dc2f9121be49ceed161caf8db5eec797f086128a2016be8072\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:50Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.424023 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/224e3211-1f68-4673-8975-7e71b1e513d0-mcd-auth-proxy-config\") pod \"machine-config-daemon-w2xjv\" (UID: \"224e3211-1f68-4673-8975-7e71b1e513d0\") " pod="openshift-machine-config-operator/machine-config-daemon-w2xjv" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.424063 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/224e3211-1f68-4673-8975-7e71b1e513d0-proxy-tls\") pod \"machine-config-daemon-w2xjv\" (UID: \"224e3211-1f68-4673-8975-7e71b1e513d0\") " pod="openshift-machine-config-operator/machine-config-daemon-w2xjv" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.424081 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s26mf\" (UniqueName: \"kubernetes.io/projected/224e3211-1f68-4673-8975-7e71b1e513d0-kube-api-access-s26mf\") pod \"machine-config-daemon-w2xjv\" (UID: \"224e3211-1f68-4673-8975-7e71b1e513d0\") " pod="openshift-machine-config-operator/machine-config-daemon-w2xjv" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.424105 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/224e3211-1f68-4673-8975-7e71b1e513d0-rootfs\") pod \"machine-config-daemon-w2xjv\" (UID: \"224e3211-1f68-4673-8975-7e71b1e513d0\") " pod="openshift-machine-config-operator/machine-config-daemon-w2xjv" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.424292 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/224e3211-1f68-4673-8975-7e71b1e513d0-rootfs\") pod \"machine-config-daemon-w2xjv\" (UID: \"224e3211-1f68-4673-8975-7e71b1e513d0\") " pod="openshift-machine-config-operator/machine-config-daemon-w2xjv" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.424802 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" 
(UniqueName: \"kubernetes.io/configmap/224e3211-1f68-4673-8975-7e71b1e513d0-mcd-auth-proxy-config\") pod \"machine-config-daemon-w2xjv\" (UID: \"224e3211-1f68-4673-8975-7e71b1e513d0\") " pod="openshift-machine-config-operator/machine-config-daemon-w2xjv" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.427154 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/224e3211-1f68-4673-8975-7e71b1e513d0-proxy-tls\") pod \"machine-config-daemon-w2xjv\" (UID: \"224e3211-1f68-4673-8975-7e71b1e513d0\") " pod="openshift-machine-config-operator/machine-config-daemon-w2xjv" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.428970 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-rg9hb" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.458793 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s26mf\" (UniqueName: \"kubernetes.io/projected/224e3211-1f68-4673-8975-7e71b1e513d0-kube-api-access-s26mf\") pod \"machine-config-daemon-w2xjv\" (UID: \"224e3211-1f68-4673-8975-7e71b1e513d0\") " pod="openshift-machine-config-operator/machine-config-daemon-w2xjv" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.460421 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rg9hb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a6ddd9-627a-4faa-a4c4-096ea19af31d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wngq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rg9hb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:50Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.474189 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2xjv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"224e3211-1f68-4673-8975-7e71b1e513d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s26mf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s26mf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w2xjv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:50Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.501428 4957 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-w2xjv" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.525409 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.525465 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.525488 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.525508 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.525523 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 23 10:51:50 crc kubenswrapper[4957]: E0123 10:51:50.525580 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-23 10:51:52.525554027 +0000 UTC m=+22.062806714 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 23 10:51:50 crc kubenswrapper[4957]: E0123 10:51:50.525600 4957 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 23 10:51:50 crc kubenswrapper[4957]: E0123 10:51:50.525630 4957 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 23 10:51:50 crc kubenswrapper[4957]: E0123 10:51:50.525655 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-23 10:51:52.525642309 +0000 UTC m=+22.062894996 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 23 10:51:50 crc kubenswrapper[4957]: E0123 10:51:50.525716 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-23 10:51:52.525699111 +0000 UTC m=+22.062951798 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 23 10:51:50 crc kubenswrapper[4957]: E0123 10:51:50.525740 4957 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 23 10:51:50 crc kubenswrapper[4957]: E0123 10:51:50.525780 4957 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 23 10:51:50 crc kubenswrapper[4957]: E0123 10:51:50.525795 4957 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 23 10:51:50 crc kubenswrapper[4957]: E0123 10:51:50.525856 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2026-01-23 10:51:52.525836554 +0000 UTC m=+22.063089311 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 23 10:51:50 crc kubenswrapper[4957]: E0123 10:51:50.525927 4957 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 23 10:51:50 crc kubenswrapper[4957]: E0123 10:51:50.525937 4957 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 23 10:51:50 crc kubenswrapper[4957]: E0123 10:51:50.525945 4957 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 23 10:51:50 crc kubenswrapper[4957]: E0123 10:51:50.525973 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-23 10:51:52.525966988 +0000 UTC m=+22.063219675 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 23 10:51:50 crc kubenswrapper[4957]: W0123 10:51:50.532864 4957 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod224e3211_1f68_4673_8975_7e71b1e513d0.slice/crio-93211bb9cb7035b60b7e4501bf9c42738080ba548daad6aa9af142020972a665 WatchSource:0}: Error finding container 93211bb9cb7035b60b7e4501bf9c42738080ba548daad6aa9af142020972a665: Status 404 returned error can't find the container with id 93211bb9cb7035b60b7e4501bf9c42738080ba548daad6aa9af142020972a665 Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.578623 4957 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-tlz2g"] Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.578908 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-tlz2g" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.584430 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.584584 4957 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-6cq2v"] Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.585202 4957 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-6cq2v" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.592684 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.593191 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.593302 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.593337 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.593424 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.593525 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.612481 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://512cd439903792d034cd6017d149d8f3e9e24ffbfc36964572fc9419d54c3513\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:50Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.626145 
4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/233fdd78-4010-4fe8-9068-ee47d8ff25d1-host-var-lib-cni-bin\") pod \"multus-tlz2g\" (UID: \"233fdd78-4010-4fe8-9068-ee47d8ff25d1\") " pod="openshift-multus/multus-tlz2g" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.626208 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/233fdd78-4010-4fe8-9068-ee47d8ff25d1-multus-socket-dir-parent\") pod \"multus-tlz2g\" (UID: \"233fdd78-4010-4fe8-9068-ee47d8ff25d1\") " pod="openshift-multus/multus-tlz2g" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.626245 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/11d94cd0-1619-4ef6-952a-aef84e1cdc75-system-cni-dir\") pod \"multus-additional-cni-plugins-6cq2v\" (UID: \"11d94cd0-1619-4ef6-952a-aef84e1cdc75\") " pod="openshift-multus/multus-additional-cni-plugins-6cq2v" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.626261 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/11d94cd0-1619-4ef6-952a-aef84e1cdc75-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6cq2v\" (UID: \"11d94cd0-1619-4ef6-952a-aef84e1cdc75\") " pod="openshift-multus/multus-additional-cni-plugins-6cq2v" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.626291 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/233fdd78-4010-4fe8-9068-ee47d8ff25d1-etc-kubernetes\") pod \"multus-tlz2g\" (UID: \"233fdd78-4010-4fe8-9068-ee47d8ff25d1\") " pod="openshift-multus/multus-tlz2g" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.626308 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/233fdd78-4010-4fe8-9068-ee47d8ff25d1-cnibin\") pod \"multus-tlz2g\" (UID: \"233fdd78-4010-4fe8-9068-ee47d8ff25d1\") " pod="openshift-multus/multus-tlz2g" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.626321 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/233fdd78-4010-4fe8-9068-ee47d8ff25d1-host-run-multus-certs\") pod \"multus-tlz2g\" (UID: \"233fdd78-4010-4fe8-9068-ee47d8ff25d1\") " pod="openshift-multus/multus-tlz2g" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.626335 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zd8wk\" (UniqueName: \"kubernetes.io/projected/11d94cd0-1619-4ef6-952a-aef84e1cdc75-kube-api-access-zd8wk\") pod \"multus-additional-cni-plugins-6cq2v\" (UID: \"11d94cd0-1619-4ef6-952a-aef84e1cdc75\") " pod="openshift-multus/multus-additional-cni-plugins-6cq2v" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.626373 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/233fdd78-4010-4fe8-9068-ee47d8ff25d1-host-var-lib-kubelet\") pod \"multus-tlz2g\" (UID: \"233fdd78-4010-4fe8-9068-ee47d8ff25d1\") " 
pod="openshift-multus/multus-tlz2g" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.626399 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jpwrq\" (UniqueName: \"kubernetes.io/projected/233fdd78-4010-4fe8-9068-ee47d8ff25d1-kube-api-access-jpwrq\") pod \"multus-tlz2g\" (UID: \"233fdd78-4010-4fe8-9068-ee47d8ff25d1\") " pod="openshift-multus/multus-tlz2g" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.626422 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/233fdd78-4010-4fe8-9068-ee47d8ff25d1-host-run-netns\") pod \"multus-tlz2g\" (UID: \"233fdd78-4010-4fe8-9068-ee47d8ff25d1\") " pod="openshift-multus/multus-tlz2g" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.626438 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/233fdd78-4010-4fe8-9068-ee47d8ff25d1-host-run-k8s-cni-cncf-io\") pod \"multus-tlz2g\" (UID: \"233fdd78-4010-4fe8-9068-ee47d8ff25d1\") " pod="openshift-multus/multus-tlz2g" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.626461 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/11d94cd0-1619-4ef6-952a-aef84e1cdc75-cni-binary-copy\") pod \"multus-additional-cni-plugins-6cq2v\" (UID: \"11d94cd0-1619-4ef6-952a-aef84e1cdc75\") " pod="openshift-multus/multus-additional-cni-plugins-6cq2v" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.626477 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/233fdd78-4010-4fe8-9068-ee47d8ff25d1-hostroot\") pod \"multus-tlz2g\" (UID: \"233fdd78-4010-4fe8-9068-ee47d8ff25d1\") " pod="openshift-multus/multus-tlz2g" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.626492 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/233fdd78-4010-4fe8-9068-ee47d8ff25d1-system-cni-dir\") pod \"multus-tlz2g\" (UID: \"233fdd78-4010-4fe8-9068-ee47d8ff25d1\") " pod="openshift-multus/multus-tlz2g" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.626506 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/233fdd78-4010-4fe8-9068-ee47d8ff25d1-cni-binary-copy\") pod \"multus-tlz2g\" (UID: \"233fdd78-4010-4fe8-9068-ee47d8ff25d1\") " pod="openshift-multus/multus-tlz2g" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.626520 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/233fdd78-4010-4fe8-9068-ee47d8ff25d1-multus-daemon-config\") pod \"multus-tlz2g\" (UID: \"233fdd78-4010-4fe8-9068-ee47d8ff25d1\") " pod="openshift-multus/multus-tlz2g" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.626582 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/11d94cd0-1619-4ef6-952a-aef84e1cdc75-cnibin\") pod \"multus-additional-cni-plugins-6cq2v\" (UID: \"11d94cd0-1619-4ef6-952a-aef84e1cdc75\") " 
pod="openshift-multus/multus-additional-cni-plugins-6cq2v" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.626604 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/11d94cd0-1619-4ef6-952a-aef84e1cdc75-os-release\") pod \"multus-additional-cni-plugins-6cq2v\" (UID: \"11d94cd0-1619-4ef6-952a-aef84e1cdc75\") " pod="openshift-multus/multus-additional-cni-plugins-6cq2v" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.626621 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/233fdd78-4010-4fe8-9068-ee47d8ff25d1-host-var-lib-cni-multus\") pod \"multus-tlz2g\" (UID: \"233fdd78-4010-4fe8-9068-ee47d8ff25d1\") " pod="openshift-multus/multus-tlz2g" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.626682 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/233fdd78-4010-4fe8-9068-ee47d8ff25d1-os-release\") pod \"multus-tlz2g\" (UID: \"233fdd78-4010-4fe8-9068-ee47d8ff25d1\") " pod="openshift-multus/multus-tlz2g" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.626698 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/233fdd78-4010-4fe8-9068-ee47d8ff25d1-multus-conf-dir\") pod \"multus-tlz2g\" (UID: \"233fdd78-4010-4fe8-9068-ee47d8ff25d1\") " pod="openshift-multus/multus-tlz2g" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.626714 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/11d94cd0-1619-4ef6-952a-aef84e1cdc75-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6cq2v\" (UID: \"11d94cd0-1619-4ef6-952a-aef84e1cdc75\") " pod="openshift-multus/multus-additional-cni-plugins-6cq2v" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.626733 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/233fdd78-4010-4fe8-9068-ee47d8ff25d1-multus-cni-dir\") pod \"multus-tlz2g\" (UID: \"233fdd78-4010-4fe8-9068-ee47d8ff25d1\") " pod="openshift-multus/multus-tlz2g" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.627253 4957 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Jan 23 10:51:50 crc kubenswrapper[4957]: W0123 10:51:50.627584 4957 reflector.go:484] object-"openshift-multus"/"default-cni-sysctl-allowlist": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-multus"/"default-cni-sysctl-allowlist": Unexpected watch close - watch lasted less than a second and no items received Jan 23 10:51:50 crc kubenswrapper[4957]: W0123 10:51:50.627592 4957 reflector.go:484] object-"openshift-machine-config-operator"/"kube-rbac-proxy": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-machine-config-operator"/"kube-rbac-proxy": Unexpected watch close - watch lasted less than a second and no items received Jan 23 10:51:50 crc kubenswrapper[4957]: W0123 10:51:50.627901 4957 reflector.go:484] object-"openshift-dns"/"kube-root-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-dns"/"kube-root-ca.crt": 
Unexpected watch close - watch lasted less than a second and no items received Jan 23 10:51:50 crc kubenswrapper[4957]: W0123 10:51:50.628098 4957 reflector.go:484] object-"openshift-dns"/"openshift-service-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-dns"/"openshift-service-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Jan 23 10:51:50 crc kubenswrapper[4957]: W0123 10:51:50.628133 4957 reflector.go:484] object-"openshift-image-registry"/"openshift-service-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-image-registry"/"openshift-service-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Jan 23 10:51:50 crc kubenswrapper[4957]: W0123 10:51:50.628135 4957 reflector.go:484] object-"openshift-multus"/"openshift-service-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-multus"/"openshift-service-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Jan 23 10:51:50 crc kubenswrapper[4957]: W0123 10:51:50.628134 4957 reflector.go:484] object-"openshift-dns"/"node-resolver-dockercfg-kz9s7": watch of *v1.Secret ended with: very short watch: object-"openshift-dns"/"node-resolver-dockercfg-kz9s7": Unexpected watch close - watch lasted less than a second and no items received Jan 23 10:51:50 crc kubenswrapper[4957]: W0123 10:51:50.628162 4957 reflector.go:484] object-"openshift-image-registry"/"node-ca-dockercfg-4777p": watch of *v1.Secret ended with: very short watch: object-"openshift-image-registry"/"node-ca-dockercfg-4777p": Unexpected watch close - watch lasted less than a second and no items received Jan 23 10:51:50 crc kubenswrapper[4957]: W0123 10:51:50.628165 4957 reflector.go:484] object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq": watch of *v1.Secret ended with: very short watch: object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq": Unexpected watch close - watch lasted less than a second and no items received Jan 23 10:51:50 crc kubenswrapper[4957]: W0123 10:51:50.628195 4957 reflector.go:484] object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz": watch of *v1.Secret ended with: very short watch: object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz": Unexpected watch close - watch lasted less than a second and no items received Jan 23 10:51:50 crc kubenswrapper[4957]: W0123 10:51:50.628208 4957 reflector.go:484] object-"openshift-multus"/"multus-daemon-config": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-multus"/"multus-daemon-config": Unexpected watch close - watch lasted less than a second and no items received Jan 23 10:51:50 crc kubenswrapper[4957]: W0123 10:51:50.628217 4957 reflector.go:484] object-"openshift-machine-config-operator"/"proxy-tls": watch of *v1.Secret ended with: very short watch: object-"openshift-machine-config-operator"/"proxy-tls": Unexpected watch close - watch lasted less than a second and no items received Jan 23 10:51:50 crc kubenswrapper[4957]: W0123 10:51:50.628223 4957 reflector.go:484] object-"openshift-multus"/"kube-root-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-multus"/"kube-root-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Jan 23 10:51:50 crc kubenswrapper[4957]: W0123 10:51:50.628231 4957 reflector.go:484] 
object-"openshift-image-registry"/"image-registry-certificates": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-image-registry"/"image-registry-certificates": Unexpected watch close - watch lasted less than a second and no items received Jan 23 10:51:50 crc kubenswrapper[4957]: W0123 10:51:50.628240 4957 reflector.go:484] object-"openshift-multus"/"default-dockercfg-2q5b6": watch of *v1.Secret ended with: very short watch: object-"openshift-multus"/"default-dockercfg-2q5b6": Unexpected watch close - watch lasted less than a second and no items received Jan 23 10:51:50 crc kubenswrapper[4957]: W0123 10:51:50.628243 4957 reflector.go:484] object-"openshift-machine-config-operator"/"kube-root-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-machine-config-operator"/"kube-root-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.628231 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Patch \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-source-55646444c4-trplf/status\": read tcp 38.102.83.9:37784->38.102.83.9:6443: use of closed network connection" Jan 23 10:51:50 crc kubenswrapper[4957]: W0123 10:51:50.628253 4957 reflector.go:484] object-"openshift-machine-config-operator"/"openshift-service-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-machine-config-operator"/"openshift-service-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Jan 23 10:51:50 crc kubenswrapper[4957]: W0123 10:51:50.628258 4957 reflector.go:484] object-"openshift-multus"/"cni-copy-resources": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-multus"/"cni-copy-resources": Unexpected watch close - watch lasted less than a second and no items received Jan 23 10:51:50 crc kubenswrapper[4957]: W0123 10:51:50.628265 4957 reflector.go:484] object-"openshift-image-registry"/"kube-root-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-image-registry"/"kube-root-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.651259 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb53662e-fe72-4c19-b3a6-f5b541e5afcb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://829cfdb541d2a7861316957f39b8b9f43ec6f9f4e309a491f4451b1f3c34a9a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d9f270c80ebedc7d79510e2f421e23789483dce954f5e1469469703660febf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90926087d1bb350c991fa9425706fcc22e12eec003aba87b72758892aae9d3a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8144556693b41dc2f9121be49ceed161caf8db5eec797f086128a2016be8072\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:50Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.663081 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rg9hb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a6ddd9-627a-4faa-a4c4-096ea19af31d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wngq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rg9hb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:50Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.689169 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2xjv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"224e3211-1f68-4673-8975-7e71b1e513d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s26mf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s26mf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w2xjv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:50Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.703949 4957 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-01-23 10:46:49 +0000 UTC, rotation deadline is 2026-10-12 15:17:41.495458111 +0000 UTC Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.704036 4957 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6292h25m50.791425747s for next certificate rotation Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.705263 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tlz2g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"233fdd78-4010-4fe8-9068-ee47d8ff25d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpwrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tlz2g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:50Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.721203 4957 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-04 11:53:00.761507253 +0000 UTC Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.727908 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/233fdd78-4010-4fe8-9068-ee47d8ff25d1-multus-socket-dir-parent\") pod \"multus-tlz2g\" (UID: \"233fdd78-4010-4fe8-9068-ee47d8ff25d1\") " pod="openshift-multus/multus-tlz2g" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.728187 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/11d94cd0-1619-4ef6-952a-aef84e1cdc75-system-cni-dir\") pod \"multus-additional-cni-plugins-6cq2v\" (UID: \"11d94cd0-1619-4ef6-952a-aef84e1cdc75\") " pod="openshift-multus/multus-additional-cni-plugins-6cq2v" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.728405 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/11d94cd0-1619-4ef6-952a-aef84e1cdc75-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6cq2v\" (UID: \"11d94cd0-1619-4ef6-952a-aef84e1cdc75\") " pod="openshift-multus/multus-additional-cni-plugins-6cq2v" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.728535 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/233fdd78-4010-4fe8-9068-ee47d8ff25d1-etc-kubernetes\") pod \"multus-tlz2g\" (UID: \"233fdd78-4010-4fe8-9068-ee47d8ff25d1\") " pod="openshift-multus/multus-tlz2g" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.728626 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/11d94cd0-1619-4ef6-952a-aef84e1cdc75-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6cq2v\" (UID: \"11d94cd0-1619-4ef6-952a-aef84e1cdc75\") " pod="openshift-multus/multus-additional-cni-plugins-6cq2v" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.728249 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/11d94cd0-1619-4ef6-952a-aef84e1cdc75-system-cni-dir\") pod \"multus-additional-cni-plugins-6cq2v\" (UID: \"11d94cd0-1619-4ef6-952a-aef84e1cdc75\") " pod="openshift-multus/multus-additional-cni-plugins-6cq2v" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.728579 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/233fdd78-4010-4fe8-9068-ee47d8ff25d1-etc-kubernetes\") pod \"multus-tlz2g\" (UID: \"233fdd78-4010-4fe8-9068-ee47d8ff25d1\") " pod="openshift-multus/multus-tlz2g" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.728052 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/233fdd78-4010-4fe8-9068-ee47d8ff25d1-multus-socket-dir-parent\") pod \"multus-tlz2g\" (UID: \"233fdd78-4010-4fe8-9068-ee47d8ff25d1\") " 
pod="openshift-multus/multus-tlz2g" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.728852 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/233fdd78-4010-4fe8-9068-ee47d8ff25d1-cnibin\") pod \"multus-tlz2g\" (UID: \"233fdd78-4010-4fe8-9068-ee47d8ff25d1\") " pod="openshift-multus/multus-tlz2g" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.728984 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/233fdd78-4010-4fe8-9068-ee47d8ff25d1-host-run-multus-certs\") pod \"multus-tlz2g\" (UID: \"233fdd78-4010-4fe8-9068-ee47d8ff25d1\") " pod="openshift-multus/multus-tlz2g" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.729091 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zd8wk\" (UniqueName: \"kubernetes.io/projected/11d94cd0-1619-4ef6-952a-aef84e1cdc75-kube-api-access-zd8wk\") pod \"multus-additional-cni-plugins-6cq2v\" (UID: \"11d94cd0-1619-4ef6-952a-aef84e1cdc75\") " pod="openshift-multus/multus-additional-cni-plugins-6cq2v" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.729205 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/233fdd78-4010-4fe8-9068-ee47d8ff25d1-host-var-lib-kubelet\") pod \"multus-tlz2g\" (UID: \"233fdd78-4010-4fe8-9068-ee47d8ff25d1\") " pod="openshift-multus/multus-tlz2g" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.729339 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jpwrq\" (UniqueName: \"kubernetes.io/projected/233fdd78-4010-4fe8-9068-ee47d8ff25d1-kube-api-access-jpwrq\") pod \"multus-tlz2g\" (UID: \"233fdd78-4010-4fe8-9068-ee47d8ff25d1\") " pod="openshift-multus/multus-tlz2g" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.729462 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/233fdd78-4010-4fe8-9068-ee47d8ff25d1-host-run-k8s-cni-cncf-io\") pod \"multus-tlz2g\" (UID: \"233fdd78-4010-4fe8-9068-ee47d8ff25d1\") " pod="openshift-multus/multus-tlz2g" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.729037 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8631604-63ce-40b0-b27e-fba17f940f20\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://523b9a208f414955faffe254957d3bb6d287eab26ea653e23c9bcc2c3182d5cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80645d17b02b24a907b20d376fcb65a794768d4c9cf07550bff63d50a011836d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3fcfc9fcf5f37f32b4a654710f3f0f5c3fab5b0b5c35239e5f1a2789d1ec480\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b48e3593322a778bf8d56e3509d97a341f1fee5
e172f8ba4bbc4c1dacefb3930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://193091ca5d5fb974b1e2da289e7fbc6e2d3a292e79d1936c7ba10266a5ba9779\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7feaf0932d8842db7af2f28984a811c16674b50ba28a4627829ce8471543daa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7feaf0932d8842db7af2f28984a811c16674b50ba28a4627829ce8471543daa9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://370f9a1676bfe308ad9883454af09a802df678db77e0e246de3cc7aea91410a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://370f9a1676bfe308ad9883454af09a802df678db77e0e246de3cc7aea91410a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://78b38484db8444b196494249f32bd63f2478d80e89ff823c8ac7c974b1ea6e76\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78b38484db8444b196494249f32bd63f2478d80e89ff823c8ac7c974b1ea6e76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:50Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.728897 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/233fdd78-4010-4fe8-9068-ee47d8ff25d1-cnibin\") pod \"multus-tlz2g\" (UID: \"233fdd78-4010-4fe8-9068-ee47d8ff25d1\") " pod="openshift-multus/multus-tlz2g" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.729366 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/233fdd78-4010-4fe8-9068-ee47d8ff25d1-host-var-lib-kubelet\") pod \"multus-tlz2g\" (UID: \"233fdd78-4010-4fe8-9068-ee47d8ff25d1\") " pod="openshift-multus/multus-tlz2g" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.729123 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/233fdd78-4010-4fe8-9068-ee47d8ff25d1-host-run-multus-certs\") pod \"multus-tlz2g\" (UID: \"233fdd78-4010-4fe8-9068-ee47d8ff25d1\") " pod="openshift-multus/multus-tlz2g" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.729522 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/233fdd78-4010-4fe8-9068-ee47d8ff25d1-host-run-k8s-cni-cncf-io\") pod \"multus-tlz2g\" (UID: \"233fdd78-4010-4fe8-9068-ee47d8ff25d1\") " pod="openshift-multus/multus-tlz2g" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.729590 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/233fdd78-4010-4fe8-9068-ee47d8ff25d1-host-run-netns\") pod \"multus-tlz2g\" (UID: \"233fdd78-4010-4fe8-9068-ee47d8ff25d1\") " pod="openshift-multus/multus-tlz2g" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.729750 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/11d94cd0-1619-4ef6-952a-aef84e1cdc75-cni-binary-copy\") pod \"multus-additional-cni-plugins-6cq2v\" (UID: \"11d94cd0-1619-4ef6-952a-aef84e1cdc75\") " pod="openshift-multus/multus-additional-cni-plugins-6cq2v" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.729774 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/233fdd78-4010-4fe8-9068-ee47d8ff25d1-hostroot\") pod \"multus-tlz2g\" (UID: \"233fdd78-4010-4fe8-9068-ee47d8ff25d1\") " pod="openshift-multus/multus-tlz2g" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.729792 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/233fdd78-4010-4fe8-9068-ee47d8ff25d1-system-cni-dir\") pod \"multus-tlz2g\" (UID: \"233fdd78-4010-4fe8-9068-ee47d8ff25d1\") " pod="openshift-multus/multus-tlz2g" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.729809 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/233fdd78-4010-4fe8-9068-ee47d8ff25d1-cni-binary-copy\") pod \"multus-tlz2g\" (UID: \"233fdd78-4010-4fe8-9068-ee47d8ff25d1\") " pod="openshift-multus/multus-tlz2g" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.729825 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/233fdd78-4010-4fe8-9068-ee47d8ff25d1-multus-daemon-config\") pod \"multus-tlz2g\" (UID: \"233fdd78-4010-4fe8-9068-ee47d8ff25d1\") " pod="openshift-multus/multus-tlz2g" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.729844 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/11d94cd0-1619-4ef6-952a-aef84e1cdc75-cnibin\") pod \"multus-additional-cni-plugins-6cq2v\" (UID: \"11d94cd0-1619-4ef6-952a-aef84e1cdc75\") " pod="openshift-multus/multus-additional-cni-plugins-6cq2v" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.729860 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/11d94cd0-1619-4ef6-952a-aef84e1cdc75-os-release\") pod \"multus-additional-cni-plugins-6cq2v\" (UID: \"11d94cd0-1619-4ef6-952a-aef84e1cdc75\") " pod="openshift-multus/multus-additional-cni-plugins-6cq2v" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.729876 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/233fdd78-4010-4fe8-9068-ee47d8ff25d1-host-var-lib-cni-multus\") pod \"multus-tlz2g\" (UID: \"233fdd78-4010-4fe8-9068-ee47d8ff25d1\") " pod="openshift-multus/multus-tlz2g" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.729879 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/233fdd78-4010-4fe8-9068-ee47d8ff25d1-hostroot\") pod \"multus-tlz2g\" (UID: \"233fdd78-4010-4fe8-9068-ee47d8ff25d1\") " pod="openshift-multus/multus-tlz2g" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.729896 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/11d94cd0-1619-4ef6-952a-aef84e1cdc75-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6cq2v\" (UID: 
\"11d94cd0-1619-4ef6-952a-aef84e1cdc75\") " pod="openshift-multus/multus-additional-cni-plugins-6cq2v" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.730719 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/233fdd78-4010-4fe8-9068-ee47d8ff25d1-os-release\") pod \"multus-tlz2g\" (UID: \"233fdd78-4010-4fe8-9068-ee47d8ff25d1\") " pod="openshift-multus/multus-tlz2g" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.730749 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/233fdd78-4010-4fe8-9068-ee47d8ff25d1-multus-conf-dir\") pod \"multus-tlz2g\" (UID: \"233fdd78-4010-4fe8-9068-ee47d8ff25d1\") " pod="openshift-multus/multus-tlz2g" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.730773 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/233fdd78-4010-4fe8-9068-ee47d8ff25d1-multus-cni-dir\") pod \"multus-tlz2g\" (UID: \"233fdd78-4010-4fe8-9068-ee47d8ff25d1\") " pod="openshift-multus/multus-tlz2g" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.730799 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/233fdd78-4010-4fe8-9068-ee47d8ff25d1-host-var-lib-cni-bin\") pod \"multus-tlz2g\" (UID: \"233fdd78-4010-4fe8-9068-ee47d8ff25d1\") " pod="openshift-multus/multus-tlz2g" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.730263 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/233fdd78-4010-4fe8-9068-ee47d8ff25d1-host-run-netns\") pod \"multus-tlz2g\" (UID: \"233fdd78-4010-4fe8-9068-ee47d8ff25d1\") " pod="openshift-multus/multus-tlz2g" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.730545 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/233fdd78-4010-4fe8-9068-ee47d8ff25d1-multus-daemon-config\") pod \"multus-tlz2g\" (UID: \"233fdd78-4010-4fe8-9068-ee47d8ff25d1\") " pod="openshift-multus/multus-tlz2g" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.730550 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/11d94cd0-1619-4ef6-952a-aef84e1cdc75-cni-binary-copy\") pod \"multus-additional-cni-plugins-6cq2v\" (UID: \"11d94cd0-1619-4ef6-952a-aef84e1cdc75\") " pod="openshift-multus/multus-additional-cni-plugins-6cq2v" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.730576 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/11d94cd0-1619-4ef6-952a-aef84e1cdc75-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6cq2v\" (UID: \"11d94cd0-1619-4ef6-952a-aef84e1cdc75\") " pod="openshift-multus/multus-additional-cni-plugins-6cq2v" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.730630 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/233fdd78-4010-4fe8-9068-ee47d8ff25d1-cni-binary-copy\") pod \"multus-tlz2g\" (UID: \"233fdd78-4010-4fe8-9068-ee47d8ff25d1\") " pod="openshift-multus/multus-tlz2g" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.729969 4957 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/11d94cd0-1619-4ef6-952a-aef84e1cdc75-cnibin\") pod \"multus-additional-cni-plugins-6cq2v\" (UID: \"11d94cd0-1619-4ef6-952a-aef84e1cdc75\") " pod="openshift-multus/multus-additional-cni-plugins-6cq2v" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.730990 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/233fdd78-4010-4fe8-9068-ee47d8ff25d1-os-release\") pod \"multus-tlz2g\" (UID: \"233fdd78-4010-4fe8-9068-ee47d8ff25d1\") " pod="openshift-multus/multus-tlz2g" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.730151 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/233fdd78-4010-4fe8-9068-ee47d8ff25d1-host-var-lib-cni-multus\") pod \"multus-tlz2g\" (UID: \"233fdd78-4010-4fe8-9068-ee47d8ff25d1\") " pod="openshift-multus/multus-tlz2g" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.731042 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/233fdd78-4010-4fe8-9068-ee47d8ff25d1-multus-conf-dir\") pod \"multus-tlz2g\" (UID: \"233fdd78-4010-4fe8-9068-ee47d8ff25d1\") " pod="openshift-multus/multus-tlz2g" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.730209 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/11d94cd0-1619-4ef6-952a-aef84e1cdc75-os-release\") pod \"multus-additional-cni-plugins-6cq2v\" (UID: \"11d94cd0-1619-4ef6-952a-aef84e1cdc75\") " pod="openshift-multus/multus-additional-cni-plugins-6cq2v" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.731115 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/233fdd78-4010-4fe8-9068-ee47d8ff25d1-multus-cni-dir\") pod \"multus-tlz2g\" (UID: \"233fdd78-4010-4fe8-9068-ee47d8ff25d1\") " pod="openshift-multus/multus-tlz2g" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.730239 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/233fdd78-4010-4fe8-9068-ee47d8ff25d1-system-cni-dir\") pod \"multus-tlz2g\" (UID: \"233fdd78-4010-4fe8-9068-ee47d8ff25d1\") " pod="openshift-multus/multus-tlz2g" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.731162 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/233fdd78-4010-4fe8-9068-ee47d8ff25d1-host-var-lib-cni-bin\") pod \"multus-tlz2g\" (UID: \"233fdd78-4010-4fe8-9068-ee47d8ff25d1\") " pod="openshift-multus/multus-tlz2g" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.740705 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:50Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.745639 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jpwrq\" (UniqueName: \"kubernetes.io/projected/233fdd78-4010-4fe8-9068-ee47d8ff25d1-kube-api-access-jpwrq\") pod \"multus-tlz2g\" (UID: \"233fdd78-4010-4fe8-9068-ee47d8ff25d1\") " pod="openshift-multus/multus-tlz2g" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.745775 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zd8wk\" (UniqueName: \"kubernetes.io/projected/11d94cd0-1619-4ef6-952a-aef84e1cdc75-kube-api-access-zd8wk\") pod \"multus-additional-cni-plugins-6cq2v\" (UID: \"11d94cd0-1619-4ef6-952a-aef84e1cdc75\") " pod="openshift-multus/multus-additional-cni-plugins-6cq2v" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.755310 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fnxz6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c7b1449-2e9b-4c07-a531-591cb968f511\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj687\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:49Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fnxz6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:50Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.767666 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9f19c40d295c11e3a1170d61fd738b1dcd8fb10087f6a1bb74e6e6c8e6cfb56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6fcda9eaf99f7b60db85da6f18a98ccca7b5bc532aa28388fc7845caf1a7356\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:50Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.768829 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.768871 4957 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.768885 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 23 10:51:50 crc kubenswrapper[4957]: E0123 10:51:50.768970 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 23 10:51:50 crc kubenswrapper[4957]: E0123 10:51:50.769107 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 23 10:51:50 crc kubenswrapper[4957]: E0123 10:51:50.769243 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.773982 4957 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.774685 4957 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.775382 4957 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.776069 4957 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.776763 4957 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.777312 4957 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.777845 4957 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.778444 
4957 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.779075 4957 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.781059 4957 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.781672 4957 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.786027 4957 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.786971 4957 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.787752 4957 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.788302 4957 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.790660 4957 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.791303 4957 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.791650 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:50Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.792075 4957 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.792634 4957 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.793202 4957 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.794100 4957 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.794659 4957 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.795481 4957 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.796102 4957 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.796576 4957 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" 
path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.797557 4957 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.798181 4957 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.799028 4957 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.799671 4957 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.800505 4957 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.800948 4957 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.801044 4957 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.802972 4957 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.803584 4957 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.804001 4957 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.805438 4957 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.806399 4957 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.806923 4957 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.807859 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea507738-b425-4366-808b-3a47317e66d0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cdf71b1a8491d3a4853fde19a5b1af1eb4697cbf07de482e22a52704ba0470f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52eea4c3c7c3b8898e64dd0eb05c1883ea1c2fa94e7e606f3ab48bbf5aaee8d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f405b6b517d30a201b793965bd82536f496d62b89562cefc7e3a9d9f7829633\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":
\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b6915f908509c8609290327ffc2dccf0e5680dc227979285a7ebaca4643cb7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e837e02e63dbe59e7920302c0fb0b5c9165e96ebb684adadb02bacd61633214\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"le observer\\\\nW0123 10:51:48.273886 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0123 10:51:48.273997 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0123 10:51:48.275269 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3502356273/tls.crt::/tmp/serving-cert-3502356273/tls.key\\\\\\\"\\\\nI0123 10:51:48.537137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0123 10:51:48.548481 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0123 10:51:48.548515 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0123 10:51:48.548544 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0123 10:51:48.548577 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0123 10:51:48.561057 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0123 10:51:48.561112 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0123 10:51:48.561123 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0123 10:51:48.561139 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0123 10:51:48.561151 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0123 10:51:48.561158 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0123 10:51:48.561167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0123 10:51:48.561209 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0123 10:51:48.561756 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da77583099215643577c5d064d67ce2cca9d0b74e7ba7c88f3a948a8516fd66c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd61e9841e88c0a34764491cfafe3e4f94233e4ac0543a3e6ed1326dd8d7280d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd61e9841e88c0a34764491cfafe3e4f94233e4ac0543a3e6ed1326dd8d7280d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:50Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.808168 4957 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.809018 4957 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.809991 4957 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.810687 4957 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.811747 4957 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.812892 4957 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.813411 4957 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.814247 4957 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.814801 4957 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.815851 4957 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.816313 4957 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.816786 4957 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.817587 4957 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.818091 4957 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.819047 4957 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.819524 4957 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.823678 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:50Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.837834 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:50Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.854856 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9f19c40d295c11e3a1170d61fd738b1dcd8fb10087f6a1bb74e6e6c8e6cfb56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6fcda9eaf99f7b60db85da6f18a98ccca7b5bc532aa28388fc7845caf1a7356\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:50Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.872030 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:50Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.889381 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6cq2v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11d94cd0-1619-4ef6-952a-aef84e1cdc75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6cq2v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:50Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.893114 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-tlz2g" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.908157 4957 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-6cq2v" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.913183 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2xjv" event={"ID":"224e3211-1f68-4673-8975-7e71b1e513d0","Type":"ContainerStarted","Data":"3dd046581d049e9ca0071a010da143a9b28d271b533b9cdc1c94d19311be0320"} Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.913480 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2xjv" event={"ID":"224e3211-1f68-4673-8975-7e71b1e513d0","Type":"ContainerStarted","Data":"f355e8990ff693448c7b8df392b7b2caeb59d6fee6cf8d5d4200f8ce1b5e03ae"} Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.913580 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2xjv" event={"ID":"224e3211-1f68-4673-8975-7e71b1e513d0","Type":"ContainerStarted","Data":"93211bb9cb7035b60b7e4501bf9c42738080ba548daad6aa9af142020972a665"} Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.914540 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-fnxz6" event={"ID":"4c7b1449-2e9b-4c07-a531-591cb968f511","Type":"ContainerStarted","Data":"6410b47b5b38b4ce50175e8cd9c2cc7ca241b914d1dba4accf3a1deb3e066ccd"} Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.914667 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-fnxz6" event={"ID":"4c7b1449-2e9b-4c07-a531-591cb968f511","Type":"ContainerStarted","Data":"eb105c36831b91d707fae49091289bb2167b4441e975005dd921fb79174b8347"} Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.915889 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-rg9hb" event={"ID":"48a6ddd9-627a-4faa-a4c4-096ea19af31d","Type":"ContainerStarted","Data":"f469de9a3c43ade33ae855757f1244dcd825827dea9633af7143c078b08d6d63"} Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.915930 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-rg9hb" event={"ID":"48a6ddd9-627a-4faa-a4c4-096ea19af31d","Type":"ContainerStarted","Data":"2b6d1077e20ddbfbf6f01ec089b4d47921b9818f5f9ae64d32ccd862609458ce"} Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.917702 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea507738-b425-4366-808b-3a47317e66d0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cdf71b1a8491d3a4853fde19a5b1af1eb4697cbf07de482e22a52704ba0470f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52eea4c3c7c3b8898e64dd0eb05c1883ea1c2fa94e7e606f3ab48bbf5aaee8d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f405b6b517d30a201b793965bd82536f496d62b89562cefc7e3a9d9f7829633\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b6915f908509c8609290327ffc2dccf0e5680dc227979285a7ebaca4643cb7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e837e02e63dbe59e7920302c0fb0b5c9165e96ebb684adadb02bacd61633214\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"le observer\\\\nW0123 10:51:48.273886 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0123 10:51:48.273997 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0123 10:51:48.275269 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3502356273/tls.crt::/tmp/serving-cert-3502356273/tls.key\\\\\\\"\\\\nI0123 10:51:48.537137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0123 10:51:48.548481 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0123 10:51:48.548515 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0123 10:51:48.548544 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0123 10:51:48.548577 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0123 10:51:48.561057 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0123 10:51:48.561112 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0123 10:51:48.561123 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0123 10:51:48.561139 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0123 10:51:48.561151 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0123 10:51:48.561158 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0123 10:51:48.561167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0123 10:51:48.561209 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0123 10:51:48.561756 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da77583099215643577c5d064d67ce2cca9d0b74e7ba7c88f3a948a8516fd66c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd61e9841e88c0a34764491cfafe3e4f94233e4ac0543a3e6ed1326dd8d7280d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd61e9841e88c0a34764491cfafe3e4f94233e4ac0543a3e6ed1326dd8d7280d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:50Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.956507 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://512cd439903792d034cd6017d149d8f3e9e24ffbfc36964572fc9419d54c3513\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:50Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.965762 4957 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-z8hcw"] Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.966530 4957 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" Jan 23 10:51:50 crc kubenswrapper[4957]: W0123 10:51:50.986718 4957 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod233fdd78_4010_4fe8_9068_ee47d8ff25d1.slice/crio-be973d97d5637dba9f626db62dbee50b16fc2ff16d87bff515a2ff8f8dbf7fec WatchSource:0}: Error finding container be973d97d5637dba9f626db62dbee50b16fc2ff16d87bff515a2ff8f8dbf7fec: Status 404 returned error can't find the container with id be973d97d5637dba9f626db62dbee50b16fc2ff16d87bff515a2ff8f8dbf7fec Jan 23 10:51:50 crc kubenswrapper[4957]: W0123 10:51:50.986981 4957 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod11d94cd0_1619_4ef6_952a_aef84e1cdc75.slice/crio-d21f663265978d40f1be7a9066dc900eac13445f69495738f3b1c6cc224195c9 WatchSource:0}: Error finding container d21f663265978d40f1be7a9066dc900eac13445f69495738f3b1c6cc224195c9: Status 404 returned error can't find the container with id d21f663265978d40f1be7a9066dc900eac13445f69495738f3b1c6cc224195c9 Jan 23 10:51:50 crc kubenswrapper[4957]: I0123 10:51:50.989495 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.007437 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.029258 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.034902 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/87adc28a-89e3-4743-a9f2-098d4a9432d8-ovnkube-script-lib\") pod \"ovnkube-node-z8hcw\" (UID: \"87adc28a-89e3-4743-a9f2-098d4a9432d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.034957 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/87adc28a-89e3-4743-a9f2-098d4a9432d8-ovnkube-config\") pod \"ovnkube-node-z8hcw\" (UID: \"87adc28a-89e3-4743-a9f2-098d4a9432d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.034980 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/87adc28a-89e3-4743-a9f2-098d4a9432d8-ovn-node-metrics-cert\") pod \"ovnkube-node-z8hcw\" (UID: \"87adc28a-89e3-4743-a9f2-098d4a9432d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.035002 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nhgm\" (UniqueName: \"kubernetes.io/projected/87adc28a-89e3-4743-a9f2-098d4a9432d8-kube-api-access-8nhgm\") pod \"ovnkube-node-z8hcw\" (UID: \"87adc28a-89e3-4743-a9f2-098d4a9432d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.035023 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/87adc28a-89e3-4743-a9f2-098d4a9432d8-host-cni-netd\") pod \"ovnkube-node-z8hcw\" (UID: \"87adc28a-89e3-4743-a9f2-098d4a9432d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.035046 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/87adc28a-89e3-4743-a9f2-098d4a9432d8-host-kubelet\") pod \"ovnkube-node-z8hcw\" (UID: \"87adc28a-89e3-4743-a9f2-098d4a9432d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.035082 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/87adc28a-89e3-4743-a9f2-098d4a9432d8-host-run-netns\") pod \"ovnkube-node-z8hcw\" (UID: \"87adc28a-89e3-4743-a9f2-098d4a9432d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.035111 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/87adc28a-89e3-4743-a9f2-098d4a9432d8-env-overrides\") pod \"ovnkube-node-z8hcw\" (UID: \"87adc28a-89e3-4743-a9f2-098d4a9432d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.035154 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/87adc28a-89e3-4743-a9f2-098d4a9432d8-run-systemd\") pod \"ovnkube-node-z8hcw\" (UID: \"87adc28a-89e3-4743-a9f2-098d4a9432d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.035204 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/87adc28a-89e3-4743-a9f2-098d4a9432d8-systemd-units\") pod \"ovnkube-node-z8hcw\" (UID: \"87adc28a-89e3-4743-a9f2-098d4a9432d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.035240 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/87adc28a-89e3-4743-a9f2-098d4a9432d8-host-slash\") pod \"ovnkube-node-z8hcw\" (UID: \"87adc28a-89e3-4743-a9f2-098d4a9432d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.035261 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/87adc28a-89e3-4743-a9f2-098d4a9432d8-host-run-ovn-kubernetes\") pod \"ovnkube-node-z8hcw\" (UID: \"87adc28a-89e3-4743-a9f2-098d4a9432d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.035301 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/87adc28a-89e3-4743-a9f2-098d4a9432d8-var-lib-openvswitch\") pod \"ovnkube-node-z8hcw\" (UID: \"87adc28a-89e3-4743-a9f2-098d4a9432d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.035339 4957 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/87adc28a-89e3-4743-a9f2-098d4a9432d8-run-ovn\") pod \"ovnkube-node-z8hcw\" (UID: \"87adc28a-89e3-4743-a9f2-098d4a9432d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.035390 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/87adc28a-89e3-4743-a9f2-098d4a9432d8-log-socket\") pod \"ovnkube-node-z8hcw\" (UID: \"87adc28a-89e3-4743-a9f2-098d4a9432d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.035425 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/87adc28a-89e3-4743-a9f2-098d4a9432d8-run-openvswitch\") pod \"ovnkube-node-z8hcw\" (UID: \"87adc28a-89e3-4743-a9f2-098d4a9432d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.035451 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/87adc28a-89e3-4743-a9f2-098d4a9432d8-node-log\") pod \"ovnkube-node-z8hcw\" (UID: \"87adc28a-89e3-4743-a9f2-098d4a9432d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.035481 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/87adc28a-89e3-4743-a9f2-098d4a9432d8-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-z8hcw\" (UID: \"87adc28a-89e3-4743-a9f2-098d4a9432d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.035512 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/87adc28a-89e3-4743-a9f2-098d4a9432d8-etc-openvswitch\") pod \"ovnkube-node-z8hcw\" (UID: \"87adc28a-89e3-4743-a9f2-098d4a9432d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.035541 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/87adc28a-89e3-4743-a9f2-098d4a9432d8-host-cni-bin\") pod \"ovnkube-node-z8hcw\" (UID: \"87adc28a-89e3-4743-a9f2-098d4a9432d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.047390 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.067863 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.087619 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.107681 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.135830 4957 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:51Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.136167 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/87adc28a-89e3-4743-a9f2-098d4a9432d8-var-lib-openvswitch\") pod \"ovnkube-node-z8hcw\" (UID: \"87adc28a-89e3-4743-a9f2-098d4a9432d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.136418 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/87adc28a-89e3-4743-a9f2-098d4a9432d8-run-ovn\") pod \"ovnkube-node-z8hcw\" (UID: \"87adc28a-89e3-4743-a9f2-098d4a9432d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.136443 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/87adc28a-89e3-4743-a9f2-098d4a9432d8-log-socket\") pod \"ovnkube-node-z8hcw\" (UID: \"87adc28a-89e3-4743-a9f2-098d4a9432d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.136474 4957 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/87adc28a-89e3-4743-a9f2-098d4a9432d8-run-openvswitch\") pod \"ovnkube-node-z8hcw\" (UID: \"87adc28a-89e3-4743-a9f2-098d4a9432d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.136493 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/87adc28a-89e3-4743-a9f2-098d4a9432d8-node-log\") pod \"ovnkube-node-z8hcw\" (UID: \"87adc28a-89e3-4743-a9f2-098d4a9432d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.136507 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/87adc28a-89e3-4743-a9f2-098d4a9432d8-run-ovn\") pod \"ovnkube-node-z8hcw\" (UID: \"87adc28a-89e3-4743-a9f2-098d4a9432d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.136533 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/87adc28a-89e3-4743-a9f2-098d4a9432d8-log-socket\") pod \"ovnkube-node-z8hcw\" (UID: \"87adc28a-89e3-4743-a9f2-098d4a9432d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.136547 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/87adc28a-89e3-4743-a9f2-098d4a9432d8-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-z8hcw\" (UID: \"87adc28a-89e3-4743-a9f2-098d4a9432d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.136252 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/87adc28a-89e3-4743-a9f2-098d4a9432d8-var-lib-openvswitch\") pod \"ovnkube-node-z8hcw\" (UID: \"87adc28a-89e3-4743-a9f2-098d4a9432d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.136565 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/87adc28a-89e3-4743-a9f2-098d4a9432d8-node-log\") pod \"ovnkube-node-z8hcw\" (UID: \"87adc28a-89e3-4743-a9f2-098d4a9432d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.136514 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/87adc28a-89e3-4743-a9f2-098d4a9432d8-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-z8hcw\" (UID: \"87adc28a-89e3-4743-a9f2-098d4a9432d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.136593 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/87adc28a-89e3-4743-a9f2-098d4a9432d8-etc-openvswitch\") pod \"ovnkube-node-z8hcw\" (UID: \"87adc28a-89e3-4743-a9f2-098d4a9432d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.136613 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" 
(UniqueName: \"kubernetes.io/host-path/87adc28a-89e3-4743-a9f2-098d4a9432d8-host-cni-bin\") pod \"ovnkube-node-z8hcw\" (UID: \"87adc28a-89e3-4743-a9f2-098d4a9432d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.136637 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/87adc28a-89e3-4743-a9f2-098d4a9432d8-ovnkube-config\") pod \"ovnkube-node-z8hcw\" (UID: \"87adc28a-89e3-4743-a9f2-098d4a9432d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.136639 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/87adc28a-89e3-4743-a9f2-098d4a9432d8-etc-openvswitch\") pod \"ovnkube-node-z8hcw\" (UID: \"87adc28a-89e3-4743-a9f2-098d4a9432d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.136636 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/87adc28a-89e3-4743-a9f2-098d4a9432d8-run-openvswitch\") pod \"ovnkube-node-z8hcw\" (UID: \"87adc28a-89e3-4743-a9f2-098d4a9432d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.136661 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/87adc28a-89e3-4743-a9f2-098d4a9432d8-ovn-node-metrics-cert\") pod \"ovnkube-node-z8hcw\" (UID: \"87adc28a-89e3-4743-a9f2-098d4a9432d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.136671 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/87adc28a-89e3-4743-a9f2-098d4a9432d8-host-cni-bin\") pod \"ovnkube-node-z8hcw\" (UID: \"87adc28a-89e3-4743-a9f2-098d4a9432d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.136687 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/87adc28a-89e3-4743-a9f2-098d4a9432d8-ovnkube-script-lib\") pod \"ovnkube-node-z8hcw\" (UID: \"87adc28a-89e3-4743-a9f2-098d4a9432d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.136712 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8nhgm\" (UniqueName: \"kubernetes.io/projected/87adc28a-89e3-4743-a9f2-098d4a9432d8-kube-api-access-8nhgm\") pod \"ovnkube-node-z8hcw\" (UID: \"87adc28a-89e3-4743-a9f2-098d4a9432d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.136770 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/87adc28a-89e3-4743-a9f2-098d4a9432d8-host-cni-netd\") pod \"ovnkube-node-z8hcw\" (UID: \"87adc28a-89e3-4743-a9f2-098d4a9432d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.136797 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/87adc28a-89e3-4743-a9f2-098d4a9432d8-host-kubelet\") pod 
\"ovnkube-node-z8hcw\" (UID: \"87adc28a-89e3-4743-a9f2-098d4a9432d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.136833 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/87adc28a-89e3-4743-a9f2-098d4a9432d8-host-run-netns\") pod \"ovnkube-node-z8hcw\" (UID: \"87adc28a-89e3-4743-a9f2-098d4a9432d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.136890 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/87adc28a-89e3-4743-a9f2-098d4a9432d8-env-overrides\") pod \"ovnkube-node-z8hcw\" (UID: \"87adc28a-89e3-4743-a9f2-098d4a9432d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.136914 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/87adc28a-89e3-4743-a9f2-098d4a9432d8-run-systemd\") pod \"ovnkube-node-z8hcw\" (UID: \"87adc28a-89e3-4743-a9f2-098d4a9432d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.136923 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/87adc28a-89e3-4743-a9f2-098d4a9432d8-host-kubelet\") pod \"ovnkube-node-z8hcw\" (UID: \"87adc28a-89e3-4743-a9f2-098d4a9432d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.136941 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/87adc28a-89e3-4743-a9f2-098d4a9432d8-systemd-units\") pod \"ovnkube-node-z8hcw\" (UID: \"87adc28a-89e3-4743-a9f2-098d4a9432d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.136965 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/87adc28a-89e3-4743-a9f2-098d4a9432d8-host-slash\") pod \"ovnkube-node-z8hcw\" (UID: \"87adc28a-89e3-4743-a9f2-098d4a9432d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.136970 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/87adc28a-89e3-4743-a9f2-098d4a9432d8-systemd-units\") pod \"ovnkube-node-z8hcw\" (UID: \"87adc28a-89e3-4743-a9f2-098d4a9432d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.136913 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/87adc28a-89e3-4743-a9f2-098d4a9432d8-host-cni-netd\") pod \"ovnkube-node-z8hcw\" (UID: \"87adc28a-89e3-4743-a9f2-098d4a9432d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.136989 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/87adc28a-89e3-4743-a9f2-098d4a9432d8-host-run-ovn-kubernetes\") pod \"ovnkube-node-z8hcw\" (UID: \"87adc28a-89e3-4743-a9f2-098d4a9432d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" Jan 23 
10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.136969 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/87adc28a-89e3-4743-a9f2-098d4a9432d8-host-run-netns\") pod \"ovnkube-node-z8hcw\" (UID: \"87adc28a-89e3-4743-a9f2-098d4a9432d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.136996 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/87adc28a-89e3-4743-a9f2-098d4a9432d8-run-systemd\") pod \"ovnkube-node-z8hcw\" (UID: \"87adc28a-89e3-4743-a9f2-098d4a9432d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.137007 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/87adc28a-89e3-4743-a9f2-098d4a9432d8-host-slash\") pod \"ovnkube-node-z8hcw\" (UID: \"87adc28a-89e3-4743-a9f2-098d4a9432d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.137382 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/87adc28a-89e3-4743-a9f2-098d4a9432d8-env-overrides\") pod \"ovnkube-node-z8hcw\" (UID: \"87adc28a-89e3-4743-a9f2-098d4a9432d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.137382 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/87adc28a-89e3-4743-a9f2-098d4a9432d8-host-run-ovn-kubernetes\") pod \"ovnkube-node-z8hcw\" (UID: \"87adc28a-89e3-4743-a9f2-098d4a9432d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.137465 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/87adc28a-89e3-4743-a9f2-098d4a9432d8-ovnkube-script-lib\") pod \"ovnkube-node-z8hcw\" (UID: \"87adc28a-89e3-4743-a9f2-098d4a9432d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.137619 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/87adc28a-89e3-4743-a9f2-098d4a9432d8-ovnkube-config\") pod \"ovnkube-node-z8hcw\" (UID: \"87adc28a-89e3-4743-a9f2-098d4a9432d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.141619 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/87adc28a-89e3-4743-a9f2-098d4a9432d8-ovn-node-metrics-cert\") pod \"ovnkube-node-z8hcw\" (UID: \"87adc28a-89e3-4743-a9f2-098d4a9432d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.194652 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nhgm\" (UniqueName: \"kubernetes.io/projected/87adc28a-89e3-4743-a9f2-098d4a9432d8-kube-api-access-8nhgm\") pod \"ovnkube-node-z8hcw\" (UID: \"87adc28a-89e3-4743-a9f2-098d4a9432d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.206640 4957 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-tlz2g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"233fdd78-4010-4fe8-9068-ee47d8ff25d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpwrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tlz2g\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:51Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.260086 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb53662e-fe72-4c19-b3a6-f5b541e5afcb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://829cfdb541d2a7861316957f39b8b9f43ec6f9f4e309a491f4451b1f3c34a9a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d9f270c80ebedc7d79510e2f421e23789483dce954f5e1469469703660febf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90926087d1bb350c991fa9425706fcc22e12eec003aba87b72758892aae9d3a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"re
startCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8144556693b41dc2f9121be49ceed161caf8db5eec797f086128a2016be8072\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:51Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.286171 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rg9hb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a6ddd9-627a-4faa-a4c4-096ea19af31d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wngq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rg9hb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:51Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.320132 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2xjv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"224e3211-1f68-4673-8975-7e71b1e513d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s26mf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s26mf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w2xjv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:51Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.373789 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8631604-63ce-40b0-b27e-fba17f940f20\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://523b9a208f414955faffe254957d3bb6d287eab26ea653e23c9bcc2c3182d5cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80645d17b02b24a907b20d376fcb65a794768d4c9cf07550bff63d50a011836d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3fcfc9fcf5f37f32b4a654710f3f0f5c3fab5b0b5c35239e5f1a2789d1ec480\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b48e3593322a778bf8d56e3509d97a341f1fee5
e172f8ba4bbc4c1dacefb3930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://193091ca5d5fb974b1e2da289e7fbc6e2d3a292e79d1936c7ba10266a5ba9779\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7feaf0932d8842db7af2f28984a811c16674b50ba28a4627829ce8471543daa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7feaf0932d8842db7af2f28984a811c16674b50ba28a4627829ce8471543daa9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://370f9a1676bfe308ad9883454af09a802df678db77e0e246de3cc7aea91410a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://370f9a1676bfe308ad9883454af09a802df678db77e0e246de3cc7aea91410a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://78b38484db8444b196494249f32bd63f2478d80e89ff823c8ac7c974b1ea6e76\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78b38484db8444b196494249f32bd63f2478d80e89ff823c8ac7c974b1ea6e76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:51Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.395006 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:51Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.433537 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fnxz6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c7b1449-2e9b-4c07-a531-591cb968f511\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj687\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:49Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fnxz6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:51Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.462673 4957 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.464077 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.464103 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.464112 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.464190 4957 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.475609 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://512cd439903792d034cd6017d149d8f3e9e24ffbfc36964572fc9419d54c3513\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:51Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.527845 4957 kubelet_node_status.go:115] "Node was previously registered" node="crc" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.528170 4957 
kubelet_node_status.go:79] "Successfully registered node" node="crc" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.529197 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.529332 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.529405 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.529472 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.529537 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:51:51Z","lastTransitionTime":"2026-01-23T10:51:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:51:51 crc kubenswrapper[4957]: E0123 10:51:51.545929 4957 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"608e000a-3057-4f1e-b4ab-15bf3bfea3b8\\\",\\\"systemUUID\\\":\\\"4219e85c-09d5-42d3-a5cb-7c9fe3da136f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:51Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.550560 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.550601 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.550612 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.550629 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.550638 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:51:51Z","lastTransitionTime":"2026-01-23T10:51:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.557787 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:51Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:51 crc kubenswrapper[4957]: E0123 10:51:51.563641 4957 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"608e000a-3057-4f1e-b4ab-15bf3bfea3b8\\\",\\\"systemUUID\\\":\\\"4219e85c-09d5-42d3-a5cb-7c9fe3da136f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:51Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.567202 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.567233 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.567245 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.567261 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.567272 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:51:51Z","lastTransitionTime":"2026-01-23T10:51:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:51:51 crc kubenswrapper[4957]: E0123 10:51:51.579333 4957 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"608e000a-3057-4f1e-b4ab-15bf3bfea3b8\\\",\\\"systemUUID\\\":\\\"4219e85c-09d5-42d3-a5cb-7c9fe3da136f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:51Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.583979 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.584017 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.584028 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.584042 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.584053 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:51:51Z","lastTransitionTime":"2026-01-23T10:51:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.586890 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Jan 23 10:51:51 crc kubenswrapper[4957]: E0123 10:51:51.596721 4957 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"608e000a-3057-4f1e-b4ab-15bf3bfea3b8\\\",\\\"systemUUID\\\":\\\"4219e85c-09d5-42d3-a5cb-7c9fe3da136f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:51Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.600238 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.600265 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.600275 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.600307 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.600319 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:51:51Z","lastTransitionTime":"2026-01-23T10:51:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.609521 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" Jan 23 10:51:51 crc kubenswrapper[4957]: E0123 10:51:51.616755 4957 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"608e000a-3057-4f1e-b4ab-15bf3bfea3b8\\\",\\\"systemUUID\\\":\\\"4219e85c-09d5-42d3-a5cb-7c9fe3da136f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:51Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:51 crc kubenswrapper[4957]: E0123 10:51:51.616909 4957 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.618262 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.618302 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.618317 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.618332 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.618344 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:51:51Z","lastTransitionTime":"2026-01-23T10:51:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:51:51 crc kubenswrapper[4957]: W0123 10:51:51.620168 4957 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod87adc28a_89e3_4743_a9f2_098d4a9432d8.slice/crio-7e83039e1bb245b00fe052661f4e75b110fe0bb66a8a851421d6385c4316b9ba WatchSource:0}: Error finding container 7e83039e1bb245b00fe052661f4e75b110fe0bb66a8a851421d6385c4316b9ba: Status 404 returned error can't find the container with id 7e83039e1bb245b00fe052661f4e75b110fe0bb66a8a851421d6385c4316b9ba Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.620167 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2xjv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"224e3211-1f68-4673-8975-7e71b1e513d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dd046581d049e9ca0071a010da143a9b28d271b533b9cdc1c94d19311be0320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s26mf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f355e8990ff693448c7b8df392b7b2caeb59d6fee6cf8d5d4200f8ce1b5e03ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s26mf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w2xjv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:51Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.627424 4957 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.647200 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.669042 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.709113 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.720131 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.720164 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.720183 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.720200 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.720210 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:51:51Z","lastTransitionTime":"2026-01-23T10:51:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.722086 4957 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-04 23:57:04.781682802 +0000 UTC Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.739824 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tlz2g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"233fdd78-4010-4fe8-9068-ee47d8ff25d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpwrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tlz2g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:51Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.767725 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.802144 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"87adc28a-89e3-4743-a9f2-098d4a9432d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\
\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z8hcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:51Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.822631 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.822657 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.822666 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.822679 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.822688 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:51:51Z","lastTransitionTime":"2026-01-23T10:51:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.835764 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb53662e-fe72-4c19-b3a6-f5b541e5afcb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://829cfdb541d2a7861316957f39b8b9f43ec6f9f4e309a491f4451b1f3c34a9a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d9f270c80ebedc7d79510e2f421e23789483dce954f5e1469469703660febf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90926087d1bb350c991fa9425706fcc22e12eec003aba87b72758892aae9d3a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8144556693b41dc2f9121be49ceed161caf8db5eec797f086128a2016be8072\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:51Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.872228 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rg9hb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a6ddd9-627a-4faa-a4c4-096ea19af31d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f469de9a3c43ade33ae855757f1244dcd825827dea9633af7143c078b08d6d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wngq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rg9hb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:51Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.886448 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.919683 4957 generic.go:334] "Generic (PLEG): container finished" podID="87adc28a-89e3-4743-a9f2-098d4a9432d8" containerID="3de3a4f166173fb3f0ff7428b1d1ddf43a9988749fc445ecff4a7527bd1f3a4b" exitCode=0 Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.919764 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" event={"ID":"87adc28a-89e3-4743-a9f2-098d4a9432d8","Type":"ContainerDied","Data":"3de3a4f166173fb3f0ff7428b1d1ddf43a9988749fc445ecff4a7527bd1f3a4b"} Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.919795 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" 
event={"ID":"87adc28a-89e3-4743-a9f2-098d4a9432d8","Type":"ContainerStarted","Data":"7e83039e1bb245b00fe052661f4e75b110fe0bb66a8a851421d6385c4316b9ba"} Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.921983 4957 generic.go:334] "Generic (PLEG): container finished" podID="11d94cd0-1619-4ef6-952a-aef84e1cdc75" containerID="9d28aff8ae7d4adbef5753dc21af0f1055f19ed4404bf9af31f34be215120555" exitCode=0 Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.922061 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6cq2v" event={"ID":"11d94cd0-1619-4ef6-952a-aef84e1cdc75","Type":"ContainerDied","Data":"9d28aff8ae7d4adbef5753dc21af0f1055f19ed4404bf9af31f34be215120555"} Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.922089 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6cq2v" event={"ID":"11d94cd0-1619-4ef6-952a-aef84e1cdc75","Type":"ContainerStarted","Data":"d21f663265978d40f1be7a9066dc900eac13445f69495738f3b1c6cc224195c9"} Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.923658 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-tlz2g" event={"ID":"233fdd78-4010-4fe8-9068-ee47d8ff25d1","Type":"ContainerStarted","Data":"d6851e0ec1040550b8c9edb1b85213d2c849e381fae6b0f09c9a7247bd9c5088"} Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.923694 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-tlz2g" event={"ID":"233fdd78-4010-4fe8-9068-ee47d8ff25d1","Type":"ContainerStarted","Data":"be973d97d5637dba9f626db62dbee50b16fc2ff16d87bff515a2ff8f8dbf7fec"} Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.924330 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.924363 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.924378 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.924411 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.924427 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:51:51Z","lastTransitionTime":"2026-01-23T10:51:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.924915 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"0b07c6571fe0e39bd6607feb900919a481ef8a36483c9b4de1c6d5ea3453ba61"} Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.927163 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.953064 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fnxz6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c7b1449-2e9b-4c07-a531-591cb968f511\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6410b47b5b38b4ce50175e8cd9c2cc7ca241b914d1dba4accf3a1deb3e066ccd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj687\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:49Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fnxz6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:51Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:51 crc kubenswrapper[4957]: I0123 10:51:51.967100 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Jan 23 10:51:52 crc kubenswrapper[4957]: I0123 10:51:52.022764 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8631604-63ce-40b0-b27e-fba17f940f20\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://523b9a208f414955faffe254957d3bb6d287eab26ea653e23c9bcc2c3182d5cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80645d17b02b24a907b20d376fcb65a794768d4c9cf07550bff63d50a011836d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3fcfc9fcf5f37f32b4a654710f3f0f5c3fab5b0b5c35239e5f1a2789d1ec480\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b48e3593322a778bf8d56e3509
d97a341f1fee5e172f8ba4bbc4c1dacefb3930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://193091ca5d5fb974b1e2da289e7fbc6e2d3a292e79d1936c7ba10266a5ba9779\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7feaf0932d8842db7af2f28984a811c16674b50ba28a4627829ce8471543daa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7feaf0932d8842db7af2f28984a811c16674b50ba28a4627829ce8471543daa9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://370f9a1676bfe308ad9883454af09a802df678db77e0e246de3cc7aea91410a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://370f9a1676bfe308ad9883454af09a802df678db77e0e246de3cc7aea91410a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://78b38484db8444b196494249f32bd63f2478d80e89ff823c8ac7c97
4b1ea6e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78b38484db8444b196494249f32bd63f2478d80e89ff823c8ac7c974b1ea6e76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:52Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:52 crc kubenswrapper[4957]: I0123 10:51:52.026481 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:51:52 crc kubenswrapper[4957]: I0123 10:51:52.026502 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:51:52 crc kubenswrapper[4957]: I0123 10:51:52.026510 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:51:52 crc kubenswrapper[4957]: I0123 10:51:52.026523 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:51:52 crc kubenswrapper[4957]: I0123 10:51:52.026532 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:51:52Z","lastTransitionTime":"2026-01-23T10:51:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:51:52 crc kubenswrapper[4957]: I0123 10:51:52.026543 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Jan 23 10:51:52 crc kubenswrapper[4957]: I0123 10:51:52.047091 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Jan 23 10:51:52 crc kubenswrapper[4957]: I0123 10:51:52.087029 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Jan 23 10:51:52 crc kubenswrapper[4957]: I0123 10:51:52.106969 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Jan 23 10:51:52 crc kubenswrapper[4957]: I0123 10:51:52.135009 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:51:52 crc kubenswrapper[4957]: I0123 10:51:52.135065 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:51:52 crc kubenswrapper[4957]: I0123 10:51:52.135075 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:51:52 crc kubenswrapper[4957]: I0123 10:51:52.135091 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:51:52 crc kubenswrapper[4957]: I0123 10:51:52.135126 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:51:52Z","lastTransitionTime":"2026-01-23T10:51:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:51:52 crc kubenswrapper[4957]: I0123 10:51:52.136946 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:52Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:52 crc kubenswrapper[4957]: I0123 10:51:52.147788 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Jan 23 10:51:52 crc kubenswrapper[4957]: I0123 10:51:52.187681 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Jan 23 10:51:52 crc kubenswrapper[4957]: I0123 10:51:52.207194 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Jan 23 10:51:52 crc kubenswrapper[4957]: I0123 10:51:52.236973 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:52Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:52 crc kubenswrapper[4957]: I0123 10:51:52.237619 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:51:52 crc kubenswrapper[4957]: I0123 10:51:52.237650 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:51:52 crc kubenswrapper[4957]: I0123 10:51:52.237659 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:51:52 crc kubenswrapper[4957]: I0123 10:51:52.237672 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:51:52 crc kubenswrapper[4957]: I0123 10:51:52.237682 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:51:52Z","lastTransitionTime":"2026-01-23T10:51:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:51:52 crc kubenswrapper[4957]: I0123 10:51:52.248618 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Jan 23 10:51:52 crc kubenswrapper[4957]: I0123 10:51:52.267419 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Jan 23 10:51:52 crc kubenswrapper[4957]: I0123 10:51:52.286906 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Jan 23 10:51:52 crc kubenswrapper[4957]: I0123 10:51:52.335366 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9f19c40d295c11e3a1170d61fd738b1dcd8fb10087f6a1bb74e6e6c8e6cfb56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6fcda9eaf99f7b60db85da6f18a98ccca7b5bc532aa28388fc7845caf1a7356\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:52Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:52 crc kubenswrapper[4957]: I0123 10:51:52.340237 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:51:52 crc kubenswrapper[4957]: I0123 10:51:52.340299 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:51:52 crc kubenswrapper[4957]: I0123 10:51:52.340314 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:51:52 crc kubenswrapper[4957]: I0123 10:51:52.340332 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:51:52 crc kubenswrapper[4957]: I0123 10:51:52.340344 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:51:52Z","lastTransitionTime":"2026-01-23T10:51:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:51:52 crc kubenswrapper[4957]: I0123 10:51:52.375817 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:52Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:52 crc kubenswrapper[4957]: I0123 10:51:52.416048 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6cq2v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11d94cd0-1619-4ef6-952a-aef84e1cdc75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6cq2v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:52Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:52 crc kubenswrapper[4957]: I0123 10:51:52.442336 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:51:52 crc kubenswrapper[4957]: I0123 10:51:52.442379 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:51:52 crc kubenswrapper[4957]: I0123 10:51:52.442391 4957 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:51:52 crc kubenswrapper[4957]: I0123 10:51:52.442409 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:51:52 crc kubenswrapper[4957]: I0123 10:51:52.442422 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:51:52Z","lastTransitionTime":"2026-01-23T10:51:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:51:52 crc kubenswrapper[4957]: I0123 10:51:52.457103 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea507738-b425-4366-808b-3a47317e66d0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cdf71b1a8491d3a4853fde19a5b1af1eb4697cbf07de482e22a52704ba0470f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52eea4c3c7c3b8898e64dd0eb05c1883ea1c2fa94e7e606f3ab48bbf5aaee8d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runni
ng\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f405b6b517d30a201b793965bd82536f496d62b89562cefc7e3a9d9f7829633\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b6915f908509c8609290327ffc2dccf0e5680dc227979285a7ebaca4643cb7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e837e02e63dbe59e7920302c0fb0b5c9165e96ebb684adadb02bacd61633214\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"le observer\\\\nW0123 10:51:48.273886 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0123 10:51:48.273997 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0123 10:51:48.275269 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3502356273/tls.crt::/tmp/serving-cert-3502356273/tls.key\\\\\\\"\\\\nI0123 10:51:48.537137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0123 10:51:48.548481 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0123 10:51:48.548515 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0123 10:51:48.548544 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0123 10:51:48.548577 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0123 10:51:48.561057 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0123 10:51:48.561112 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0123 10:51:48.561123 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0123 10:51:48.561139 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0123 10:51:48.561151 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0123 10:51:48.561158 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0123 10:51:48.561167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' 
detected.\\\\nI0123 10:51:48.561209 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0123 10:51:48.561756 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da77583099215643577c5d064d67ce2cca9d0b74e7ba7c88f3a948a8516fd66c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd61e9841e88c0a34764491cfafe3e4f94233e4ac0543a3e6ed1326dd8d7280d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd61e9841e88c0a34764491cfafe3e4f94233e4ac0543a3e6ed1326dd8d7280d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:52Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:52 crc kubenswrapper[4957]: I0123 10:51:52.496578 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fnxz6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c7b1449-2e9b-4c07-a531-591cb968f511\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6410b47b5b38b4ce50175e8cd9c2cc7ca241b914d1dba4accf3a1deb3e066ccd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj687\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:49Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fnxz6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:52Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:52 crc kubenswrapper[4957]: I0123 10:51:52.541849 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8631604-63ce-40b0-b27e-fba17f940f20\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://523b9a208f414955faffe254957d3bb6d287eab26ea653e23c9bcc2c3182d5cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80645d17b02b24a907b20d376fcb65a794768d4c9cf07550bff63d50a011836d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3fcfc9fcf5f37f32b4a654710f3f0f5c3fab5b0b5c35239e5f1a2789d1ec480\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b48e3593322a778bf8d56e3509d97a341f1fee5
e172f8ba4bbc4c1dacefb3930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://193091ca5d5fb974b1e2da289e7fbc6e2d3a292e79d1936c7ba10266a5ba9779\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7feaf0932d8842db7af2f28984a811c16674b50ba28a4627829ce8471543daa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7feaf0932d8842db7af2f28984a811c16674b50ba28a4627829ce8471543daa9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://370f9a1676bfe308ad9883454af09a802df678db77e0e246de3cc7aea91410a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://370f9a1676bfe308ad9883454af09a802df678db77e0e246de3cc7aea91410a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://78b38484db8444b196494249f32bd63f2478d80e89ff823c8ac7c974b1ea6e76\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78b38484db8444b196494249f32bd63f2478d80e89ff823c8ac7c974b1ea6e76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:52Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:52 crc kubenswrapper[4957]: I0123 10:51:52.544438 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:51:52 crc kubenswrapper[4957]: I0123 10:51:52.544512 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:51:52 crc kubenswrapper[4957]: I0123 10:51:52.544535 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:51:52 crc kubenswrapper[4957]: I0123 10:51:52.544566 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:51:52 crc kubenswrapper[4957]: I0123 10:51:52.544587 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:51:52Z","lastTransitionTime":"2026-01-23T10:51:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:51:52 crc kubenswrapper[4957]: I0123 10:51:52.551141 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 23 10:51:52 crc kubenswrapper[4957]: I0123 10:51:52.551229 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 23 10:51:52 crc kubenswrapper[4957]: E0123 10:51:52.551292 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-23 10:51:56.551250302 +0000 UTC m=+26.088502989 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 23 10:51:52 crc kubenswrapper[4957]: E0123 10:51:52.551334 4957 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 23 10:51:52 crc kubenswrapper[4957]: I0123 10:51:52.551344 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 23 10:51:52 crc kubenswrapper[4957]: E0123 10:51:52.551374 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-23 10:51:56.551360686 +0000 UTC m=+26.088613373 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 23 10:51:52 crc kubenswrapper[4957]: I0123 10:51:52.551389 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 23 10:51:52 crc kubenswrapper[4957]: I0123 10:51:52.551409 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 23 10:51:52 crc kubenswrapper[4957]: E0123 10:51:52.551467 4957 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 23 10:51:52 crc kubenswrapper[4957]: E0123 10:51:52.551477 4957 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 23 10:51:52 crc kubenswrapper[4957]: E0123 10:51:52.551481 4957 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 23 10:51:52 crc kubenswrapper[4957]: E0123 10:51:52.551490 4957 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 23 10:51:52 crc kubenswrapper[4957]: E0123 10:51:52.551494 4957 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 23 10:51:52 crc kubenswrapper[4957]: E0123 10:51:52.551500 4957 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 23 10:51:52 crc kubenswrapper[4957]: E0123 10:51:52.551507 4957 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 23 10:51:52 crc kubenswrapper[4957]: E0123 10:51:52.551531 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-23 10:51:56.55152479 +0000 UTC m=+26.088777477 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 23 10:51:52 crc kubenswrapper[4957]: E0123 10:51:52.551545 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-23 10:51:56.551539471 +0000 UTC m=+26.088792158 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 23 10:51:52 crc kubenswrapper[4957]: E0123 10:51:52.551579 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-23 10:51:56.551558191 +0000 UTC m=+26.088810878 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 23 10:51:52 crc kubenswrapper[4957]: I0123 10:51:52.573745 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b07c6571fe0e39bd6607feb900919a481ef8a36483c9b4de1c6d5ea3453ba61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:52Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:52 crc kubenswrapper[4957]: I0123 10:51:52.615627 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:52Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:52 crc kubenswrapper[4957]: I0123 10:51:52.646434 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:51:52 crc kubenswrapper[4957]: I0123 10:51:52.646474 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:51:52 crc kubenswrapper[4957]: I0123 10:51:52.646483 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:51:52 crc kubenswrapper[4957]: I0123 10:51:52.646498 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:51:52 crc kubenswrapper[4957]: I0123 10:51:52.646507 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:51:52Z","lastTransitionTime":"2026-01-23T10:51:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:51:52 crc kubenswrapper[4957]: I0123 10:51:52.661705 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9f19c40d295c11e3a1170d61fd738b1dcd8fb10087f6a1bb74e6e6c8e6cfb56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6fcda9eaf99f7b60db85da6f18a98ccca7b5bc532aa28388fc7845caf1a7356\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:52Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:52 crc kubenswrapper[4957]: I0123 10:51:52.696943 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:52Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:52 crc kubenswrapper[4957]: I0123 10:51:52.723207 4957 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-24 16:48:03.438207649 +0000 UTC Jan 23 10:51:52 crc kubenswrapper[4957]: I0123 10:51:52.736253 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6cq2v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11d94cd0-1619-4ef6-952a-aef84e1cdc75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d28aff8ae7d4adbef5753dc21af0f1055f19ed4404bf9af31f34be215120555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d28aff8ae7d4adbef5753dc21af0f1055f19ed4404bf9af31f34be215120555\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\
\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"po
dIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6cq2v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:52Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:52 crc kubenswrapper[4957]: I0123 10:51:52.748755 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:51:52 crc kubenswrapper[4957]: I0123 10:51:52.748794 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:51:52 crc kubenswrapper[4957]: I0123 10:51:52.748805 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:51:52 crc kubenswrapper[4957]: I0123 10:51:52.748820 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:51:52 crc kubenswrapper[4957]: I0123 10:51:52.748829 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:51:52Z","lastTransitionTime":"2026-01-23T10:51:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:51:52 crc kubenswrapper[4957]: I0123 10:51:52.769646 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 23 10:51:52 crc kubenswrapper[4957]: I0123 10:51:52.769646 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 23 10:51:52 crc kubenswrapper[4957]: E0123 10:51:52.769763 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 23 10:51:52 crc kubenswrapper[4957]: E0123 10:51:52.769823 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 23 10:51:52 crc kubenswrapper[4957]: I0123 10:51:52.769656 4957 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 23 10:51:52 crc kubenswrapper[4957]: E0123 10:51:52.769883 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 23 10:51:52 crc kubenswrapper[4957]: I0123 10:51:52.777557 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea507738-b425-4366-808b-3a47317e66d0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cdf71b1a8491d3a4853fde19a5b1af1eb4697cbf07de482e22a52704ba0470f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52eea4c3c7c3b8898e64dd0eb05c1883ea1c2fa94e7e606f3ab48bbf5aaee8d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir
\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f405b6b517d30a201b793965bd82536f496d62b89562cefc7e3a9d9f7829633\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b6915f908509c8609290327ffc2dccf0e5680dc227979285a7ebaca4643cb7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e837e02e63dbe59e7920302c0fb0b5c9165e96ebb684adadb02bacd61633214\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"le observer\\\\nW0123 10:51:48.273886 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0123 10:51:48.273997 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0123 10:51:48.275269 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3502356273/tls.crt::/tmp/serving-cert-3502356273/tls.key\\\\\\\"\\\\nI0123 10:51:48.537137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0123 10:51:48.548481 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0123 10:51:48.548515 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0123 10:51:48.548544 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0123 10:51:48.548577 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0123 10:51:48.561057 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0123 10:51:48.561112 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0123 10:51:48.561123 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0123 10:51:48.561139 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0123 10:51:48.561151 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0123 10:51:48.561158 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0123 10:51:48.561167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0123 10:51:48.561209 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0123 10:51:48.561756 
1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da77583099215643577c5d064d67ce2cca9d0b74e7ba7c88f3a948a8516fd66c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd61e9841e88c0a34764491cfafe3e4f94233e4ac0543a3e6ed1326dd8d7280d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd61e9841e88c0a34764491cfafe3e4f94233e4ac0543a3e6ed1326dd8d7280d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:52Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:52 crc kubenswrapper[4957]: I0123 10:51:52.816180 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://512cd439903792d034cd6017d149d8f3e9e24ffbfc36964572fc9419d54c3513\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:52Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:52 crc kubenswrapper[4957]: I0123 10:51:52.850882 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:51:52 crc kubenswrapper[4957]: I0123 10:51:52.850920 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:51:52 crc kubenswrapper[4957]: I0123 10:51:52.850931 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:51:52 crc kubenswrapper[4957]: I0123 10:51:52.850952 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:51:52 crc kubenswrapper[4957]: I0123 10:51:52.850973 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:51:52Z","lastTransitionTime":"2026-01-23T10:51:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:51:52 crc kubenswrapper[4957]: I0123 10:51:52.856328 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:52Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:52 crc kubenswrapper[4957]: I0123 10:51:52.896016 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2xjv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"224e3211-1f68-4673-8975-7e71b1e513d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dd046581d049e9ca0071a010da143a9b28d271b533b9cdc1c94d19311be0320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s26mf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f355e8990ff693448c7b8df392b7b2caeb59d6fee6cf8d5d4200f8ce1b5e03ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s26mf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w2xjv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:52Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:52 crc kubenswrapper[4957]: I0123 10:51:52.930137 4957 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" event={"ID":"87adc28a-89e3-4743-a9f2-098d4a9432d8","Type":"ContainerStarted","Data":"7f862c6f11fe904458a8ecde92079c0b4aa4a9cb4dfc6f2ca094a1d3142570d4"} Jan 23 10:51:52 crc kubenswrapper[4957]: I0123 10:51:52.930178 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" event={"ID":"87adc28a-89e3-4743-a9f2-098d4a9432d8","Type":"ContainerStarted","Data":"0efa75cf10a812bc4de5b071048558eb5f48828f6fb3049f3820fe5e0b7e2b0b"} Jan 23 10:51:52 crc kubenswrapper[4957]: I0123 10:51:52.930188 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" event={"ID":"87adc28a-89e3-4743-a9f2-098d4a9432d8","Type":"ContainerStarted","Data":"0ca18567eec1b0cc34d911b28d9f3d670a061722086817f58236f6a0da557262"} Jan 23 10:51:52 crc kubenswrapper[4957]: I0123 10:51:52.930197 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" event={"ID":"87adc28a-89e3-4743-a9f2-098d4a9432d8","Type":"ContainerStarted","Data":"1a14cf2687aa7c7a4c43dffbc2ad99a41aef0e46719171f63c7f769ee2d54e4f"} Jan 23 10:51:52 crc kubenswrapper[4957]: I0123 10:51:52.930205 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" event={"ID":"87adc28a-89e3-4743-a9f2-098d4a9432d8","Type":"ContainerStarted","Data":"d1a90ac89ce8ac710e5f8cff26e69aff44a735ef8155a7e93324809904a33e01"} Jan 23 10:51:52 crc kubenswrapper[4957]: I0123 10:51:52.930214 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" event={"ID":"87adc28a-89e3-4743-a9f2-098d4a9432d8","Type":"ContainerStarted","Data":"26b4bdc4f2514902dc2c95df59af4c954a5c1905821f5981e9437ff54d6d544d"} Jan 23 10:51:52 crc kubenswrapper[4957]: I0123 10:51:52.931855 4957 generic.go:334] "Generic (PLEG): container finished" podID="11d94cd0-1619-4ef6-952a-aef84e1cdc75" containerID="90e2094361e033ef7c8226913e9e396e6a2036617edb9c5ad0ed00c2c5f9f707" exitCode=0 Jan 23 10:51:52 crc kubenswrapper[4957]: I0123 10:51:52.932382 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6cq2v" event={"ID":"11d94cd0-1619-4ef6-952a-aef84e1cdc75","Type":"ContainerDied","Data":"90e2094361e033ef7c8226913e9e396e6a2036617edb9c5ad0ed00c2c5f9f707"} Jan 23 10:51:52 crc kubenswrapper[4957]: I0123 10:51:52.954591 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tlz2g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"233fdd78-4010-4fe8-9068-ee47d8ff25d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6851e0ec1040550b8c9edb1b85213d2c849e381fae6b0f09c9a7247bd9c5088\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpwrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tlz2g\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:52Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:52 crc kubenswrapper[4957]: I0123 10:51:52.957165 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:51:52 crc kubenswrapper[4957]: I0123 10:51:52.957196 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:51:52 crc kubenswrapper[4957]: I0123 10:51:52.957206 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:51:52 crc kubenswrapper[4957]: I0123 10:51:52.957219 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:51:52 crc kubenswrapper[4957]: I0123 10:51:52.957227 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:51:52Z","lastTransitionTime":"2026-01-23T10:51:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:51:52 crc kubenswrapper[4957]: I0123 10:51:52.989546 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87adc28a-89e3-4743-a9f2-098d4a9432d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3de3a4f166173fb3f0ff7428b1d1ddf43a9988749fc445ecff4a7527bd1f3a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3de3a4f166173fb3f0ff7428b1d1ddf43a9988749fc445ecff4a7527bd1f3a4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z8hcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:52Z 
is after 2025-08-24T17:21:41Z" Jan 23 10:51:53 crc kubenswrapper[4957]: I0123 10:51:53.015472 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb53662e-fe72-4c19-b3a6-f5b541e5afcb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://829cfdb541d2a7861316957f39b8b9f43ec6f9f4e309a491f4451b1f3c34a9a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d9f270c80ebedc7d79510e2f421e23789483dce954f5e1469469703660febf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90926087d1bb350c991fa9425706fcc22e12eec003aba87b72758892aae9d3a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\
\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8144556693b41dc2f9121be49ceed161caf8db5eec797f086128a2016be8072\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:53Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:53 crc kubenswrapper[4957]: I0123 10:51:53.055695 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rg9hb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a6ddd9-627a-4faa-a4c4-096ea19af31d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f469de9a3c43ade33ae855757f1244dcd825827dea9633af7143c078b08d6d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wngq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rg9hb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:53Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:53 crc kubenswrapper[4957]: I0123 10:51:53.060791 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:51:53 crc kubenswrapper[4957]: I0123 10:51:53.060820 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:51:53 crc kubenswrapper[4957]: I0123 10:51:53.060829 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:51:53 crc kubenswrapper[4957]: I0123 10:51:53.061012 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:51:53 crc kubenswrapper[4957]: I0123 10:51:53.061027 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:51:53Z","lastTransitionTime":"2026-01-23T10:51:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:51:53 crc kubenswrapper[4957]: I0123 10:51:53.102687 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8631604-63ce-40b0-b27e-fba17f940f20\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://523b9a208f414955faffe254957d3bb6d287eab26ea653e23c9bcc2c3182d5cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80645d17b02b24a907b20d376fcb65a794768d4c9cf07550bff63d50a011836d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3fcfc9fcf5f37f32b4a654710f3f0f5c3fab5b0b5c35239e5f1a2789d1ec480\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":
0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b48e3593322a778bf8d56e3509d97a341f1fee5e172f8ba4bbc4c1dacefb3930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://193091ca5d5fb974b1e2da289e7fbc6e2d3a292e79d1936c7ba10266a5ba9779\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7feaf0932d8842db7af2f28984a811c16674b50ba28a4627829ce8471543daa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7feaf0932d8842db7af2f28984a811c16674b50ba28a4627829ce8471543daa9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://370f9a1676bfe308ad9883454af09a802df678db77e0e246de3cc7aea91410a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"termi
nated\\\":{\\\"containerID\\\":\\\"cri-o://370f9a1676bfe308ad9883454af09a802df678db77e0e246de3cc7aea91410a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://78b38484db8444b196494249f32bd63f2478d80e89ff823c8ac7c974b1ea6e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78b38484db8444b196494249f32bd63f2478d80e89ff823c8ac7c974b1ea6e76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:53Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:53 crc kubenswrapper[4957]: I0123 10:51:53.135628 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b07c6571fe0e39bd6607feb900919a481ef8a36483c9b4de1c6d5ea3453ba61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:53Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:53 crc kubenswrapper[4957]: I0123 10:51:53.163585 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:51:53 crc kubenswrapper[4957]: I0123 10:51:53.163620 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:51:53 crc kubenswrapper[4957]: I0123 10:51:53.163631 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:51:53 crc kubenswrapper[4957]: I0123 10:51:53.163646 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:51:53 crc kubenswrapper[4957]: I0123 10:51:53.163657 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:51:53Z","lastTransitionTime":"2026-01-23T10:51:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:51:53 crc kubenswrapper[4957]: I0123 10:51:53.173742 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fnxz6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c7b1449-2e9b-4c07-a531-591cb968f511\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6410b47b5b38b4ce50175e8cd9c2cc7ca241b914d1dba4accf3a1deb3e066ccd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj687\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:49Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fnxz6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:53Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:53 crc kubenswrapper[4957]: I0123 10:51:53.218670 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:53Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:53 crc kubenswrapper[4957]: I0123 10:51:53.259207 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6cq2v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11d94cd0-1619-4ef6-952a-aef84e1cdc75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d28aff8ae7d4adbef5753dc21af0f1055f19ed4404bf9af31f34be215120555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d28aff8ae7d4adbef5753dc21af0f1055f19ed4404bf9af31f34be215120555\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90e2094361e033ef7c8226913e9e396e6a2036617edb9c5ad0ed00c2c5f9f707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90e2094361e033ef7c8226913e9e396e6a2036617edb9c5ad0ed00c2c5f9f707\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-
23T10:51:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6cq2v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:53Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:53 crc kubenswrapper[4957]: I0123 10:51:53.265674 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:51:53 crc kubenswrapper[4957]: I0123 10:51:53.265701 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:51:53 crc kubenswrapper[4957]: I0123 10:51:53.265709 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:51:53 crc kubenswrapper[4957]: I0123 10:51:53.265723 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:51:53 crc kubenswrapper[4957]: I0123 10:51:53.265734 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:51:53Z","lastTransitionTime":"2026-01-23T10:51:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:51:53 crc kubenswrapper[4957]: I0123 10:51:53.298136 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea507738-b425-4366-808b-3a47317e66d0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cdf71b1a8491d3a4853fde19a5b1af1eb4697cbf07de482e22a52704ba0470f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52eea4c3c7c3b8898e64dd0eb05c1883ea1c2fa94e7e606f3ab48bbf5aaee8d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f405b6b517d30a201b793965bd82536f496d62b89562cefc7e3a9d9f7829633\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b6915f908509c8609290327ffc2dccf0e5680dc227979285a7ebaca4643cb7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e837e02e63dbe59e7920302c0fb0b5c9165e96ebb684adadb02bacd61633214\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"le observer\\\\nW0123 10:51:48.273886 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0123 10:51:48.273997 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0123 10:51:48.275269 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3502356273/tls.crt::/tmp/serving-cert-3502356273/tls.key\\\\\\\"\\\\nI0123 10:51:48.537137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0123 10:51:48.548481 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0123 10:51:48.548515 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0123 10:51:48.548544 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0123 10:51:48.548577 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0123 10:51:48.561057 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0123 10:51:48.561112 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0123 10:51:48.561123 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0123 10:51:48.561139 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0123 10:51:48.561151 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0123 10:51:48.561158 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0123 10:51:48.561167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0123 10:51:48.561209 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0123 10:51:48.561756 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da77583099215643577c5d064d67ce2cca9d0b74e7ba7c88f3a948a8516fd66c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd61e9841e88c0a34764491cfafe3e4f94233e4ac0543a3e6ed1326dd8d7280d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd61e9841e88c0a34764491cfafe3e4f94233e4ac0543a3e6ed1326dd8d7280d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:53Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:53 crc kubenswrapper[4957]: I0123 10:51:53.336512 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:53Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:53 crc kubenswrapper[4957]: I0123 10:51:53.368024 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:51:53 crc kubenswrapper[4957]: I0123 10:51:53.368079 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:51:53 crc kubenswrapper[4957]: I0123 10:51:53.368096 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:51:53 crc kubenswrapper[4957]: I0123 10:51:53.368119 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:51:53 crc kubenswrapper[4957]: I0123 10:51:53.368136 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:51:53Z","lastTransitionTime":"2026-01-23T10:51:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:51:53 crc kubenswrapper[4957]: I0123 10:51:53.375756 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9f19c40d295c11e3a1170d61fd738b1dcd8fb10087f6a1bb74e6e6c8e6cfb56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6fcda9eaf99f7b60db85da6f18a98ccca7b5bc532aa28388fc7845caf1a7356\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:53Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:53 crc kubenswrapper[4957]: I0123 10:51:53.417515 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://512cd439903792d034cd6017d149d8f3e9e24ffbfc36964572fc9419d54c3513\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:53Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:53 crc kubenswrapper[4957]: I0123 10:51:53.460982 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:53Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:53 crc kubenswrapper[4957]: I0123 10:51:53.470655 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:51:53 crc kubenswrapper[4957]: I0123 10:51:53.470690 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:51:53 crc kubenswrapper[4957]: I0123 10:51:53.470698 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:51:53 crc kubenswrapper[4957]: I0123 10:51:53.470712 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:51:53 crc kubenswrapper[4957]: I0123 10:51:53.470721 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:51:53Z","lastTransitionTime":"2026-01-23T10:51:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:51:53 crc kubenswrapper[4957]: I0123 10:51:53.496832 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb53662e-fe72-4c19-b3a6-f5b541e5afcb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://829cfdb541d2a7861316957f39b8b9f43ec6f9f4e309a491f4451b1f3c34a9a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d9f270c80ebedc7d79510e2f421e23789483dce954f5e1469469703660febf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90926087d1bb350c991fa9425706fcc22e12eec003aba87b72758892aae9d3a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8144556693b41dc2f9121be49ceed161caf8db5eec797f086128a2016be8072\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:53Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:53 crc kubenswrapper[4957]: I0123 10:51:53.532405 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rg9hb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a6ddd9-627a-4faa-a4c4-096ea19af31d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f469de9a3c43ade33ae855757f1244dcd825827dea9633af7143c078b08d6d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wngq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rg9hb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:53Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:53 crc kubenswrapper[4957]: I0123 10:51:53.573075 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2xjv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"224e3211-1f68-4673-8975-7e71b1e513d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dd046581d049e9ca0071a010da143a9b28d271b533b9cdc1c94d19311be0320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s26mf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f355e8990ff693448c7b8df392b7b2caeb59d6fee6cf8d5d4200f8ce1b5e03ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s26mf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w2xjv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:53Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:53 crc kubenswrapper[4957]: I0123 10:51:53.573300 4957 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:51:53 crc kubenswrapper[4957]: I0123 10:51:53.573338 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:51:53 crc kubenswrapper[4957]: I0123 10:51:53.573349 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:51:53 crc kubenswrapper[4957]: I0123 10:51:53.573363 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:51:53 crc kubenswrapper[4957]: I0123 10:51:53.573374 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:51:53Z","lastTransitionTime":"2026-01-23T10:51:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:51:53 crc kubenswrapper[4957]: I0123 10:51:53.618266 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tlz2g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"233fdd78-4010-4fe8-9068-ee47d8ff25d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6851e0ec1040550b8c9edb1b85213d2c849e381fae6b0f09c9a7247bd9c5088\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin
\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpwrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tlz2g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:53Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:53 crc kubenswrapper[4957]: I0123 10:51:53.662767 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87adc28a-89e3-4743-a9f2-098d4a9432d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3de3a4f166173fb3f0ff7428b1d1ddf43a9988749fc445ecff4a7527bd1f3a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3de3a4f166173fb3f0ff7428b1d1ddf43a9988749fc445ecff4a7527bd1f3a4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z8hcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:53Z 
is after 2025-08-24T17:21:41Z" Jan 23 10:51:53 crc kubenswrapper[4957]: I0123 10:51:53.675333 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:51:53 crc kubenswrapper[4957]: I0123 10:51:53.675373 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:51:53 crc kubenswrapper[4957]: I0123 10:51:53.675386 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:51:53 crc kubenswrapper[4957]: I0123 10:51:53.675404 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:51:53 crc kubenswrapper[4957]: I0123 10:51:53.675417 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:51:53Z","lastTransitionTime":"2026-01-23T10:51:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:51:53 crc kubenswrapper[4957]: I0123 10:51:53.700116 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8631604-63ce-40b0-b27e-fba17f940f20\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://523b9a208f414955faffe254957d3bb6d287eab26ea653e23c9bcc2c3182d5cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80645d17b02b24a907b20d376fcb65a794768d4c9cf07550bff63d50a011836d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:
07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3fcfc9fcf5f37f32b4a654710f3f0f5c3fab5b0b5c35239e5f1a2789d1ec480\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b48e3593322a778bf8d56e3509d97a341f1fee5e172f8ba4bbc4c1dacefb3930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://193091ca5d5fb974b1e2da289e7fbc6e2d3a292e79d1936c7ba10266a5ba9779\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7feaf0932d8842db7af2f28984a811c16674b50ba28a4627829ce8471543daa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731c
a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7feaf0932d8842db7af2f28984a811c16674b50ba28a4627829ce8471543daa9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://370f9a1676bfe308ad9883454af09a802df678db77e0e246de3cc7aea91410a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://370f9a1676bfe308ad9883454af09a802df678db77e0e246de3cc7aea91410a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://78b38484db8444b196494249f32bd63f2478d80e89ff823c8ac7c974b1ea6e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78b38484db8444b196494249f32bd63f2478d80e89ff823c8ac7c974b1ea6e76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:53Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:53 crc kubenswrapper[4957]: I0123 10:51:53.723849 4957 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-07 23:02:28.887752495 +0000 UTC Jan 23 10:51:53 crc kubenswrapper[4957]: I0123 10:51:53.733803 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b07c6571fe0e39bd6607feb900919a481ef8a36483c9b4de1c6d5ea3453ba61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:53Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:53 crc kubenswrapper[4957]: I0123 10:51:53.772391 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fnxz6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c7b1449-2e9b-4c07-a531-591cb968f511\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6410b47b5b38b4ce50175e8cd9c2cc7ca241b914d1dba4accf3a1deb3e066ccd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj687\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:49Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fnxz6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:53Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:53 crc kubenswrapper[4957]: I0123 10:51:53.777923 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:51:53 crc kubenswrapper[4957]: I0123 10:51:53.777978 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:51:53 crc kubenswrapper[4957]: I0123 10:51:53.777995 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:51:53 crc kubenswrapper[4957]: I0123 10:51:53.778018 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:51:53 crc kubenswrapper[4957]: I0123 10:51:53.778034 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:51:53Z","lastTransitionTime":"2026-01-23T10:51:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:51:53 crc kubenswrapper[4957]: I0123 10:51:53.815754 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9f19c40d295c11e3a1170d61fd738b1dcd8fb10087f6a1bb74e6e6c8e6cfb56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6fcda9eaf99f7b60db85da6f18a98ccca7b5bc532aa28388fc7845caf1a7356\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:53Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:53 crc kubenswrapper[4957]: I0123 10:51:53.858986 4957 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:53Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:53 crc kubenswrapper[4957]: I0123 10:51:53.880737 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:51:53 crc kubenswrapper[4957]: I0123 10:51:53.880765 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:51:53 crc kubenswrapper[4957]: I0123 10:51:53.880774 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:51:53 crc kubenswrapper[4957]: I0123 10:51:53.880799 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:51:53 crc kubenswrapper[4957]: I0123 10:51:53.880808 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:51:53Z","lastTransitionTime":"2026-01-23T10:51:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:51:53 crc kubenswrapper[4957]: I0123 10:51:53.897691 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6cq2v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11d94cd0-1619-4ef6-952a-aef84e1cdc75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d28aff8ae7d4adbef5753dc21af0f1055f19ed4404bf9af31f34be215120555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d28aff8ae7d4adbef5753dc21af0f1055f19ed4404bf9af31f34be215120555\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"
/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90e2094361e033ef7c8226913e9e396e6a2036617edb9c5ad0ed00c2c5f9f707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90e2094361e033ef7c8226913e9e396e6a2036617edb9c5ad0ed00c2c5f9f707\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\
\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6cq2v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:53Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:53 crc kubenswrapper[4957]: I0123 10:51:53.936571 4957 generic.go:334] "Generic (PLEG): container finished" podID="11d94cd0-1619-4ef6-952a-aef84e1cdc75" containerID="e945a247fdc5833508edb04c5130a406d137e3b0ab1be12f9bf954f2621c3bf7" exitCode=0 Jan 23 10:51:53 crc kubenswrapper[4957]: I0123 10:51:53.936619 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6cq2v" event={"ID":"11d94cd0-1619-4ef6-952a-aef84e1cdc75","Type":"ContainerDied","Data":"e945a247fdc5833508edb04c5130a406d137e3b0ab1be12f9bf954f2621c3bf7"} Jan 23 10:51:53 crc kubenswrapper[4957]: I0123 10:51:53.938142 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea507738-b425-4366-808b-3a47317e66d0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cdf71b1a8491d3a4853fde19a5b1af1eb4697cbf07de482e22a52704ba0470f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52eea4c3c7c3b8898e64dd0eb05c1883ea1c2fa94e7e606f3ab48bbf5aaee8d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f405b6b517d30a201b793965bd82536f496d62b89562cefc7e3a9d9f7829633\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b6915f908509c8609290327ffc2dccf0e5680dc227979285a7ebaca4643cb7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e837e02e63dbe59e7920302c0fb0b5c9165e96ebb684adadb02bacd61633214\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"le observer\\\\nW0123 10:51:48.273886 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0123 10:51:48.273997 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0123 10:51:48.275269 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3502356273/tls.crt::/tmp/serving-cert-3502356273/tls.key\\\\\\\"\\\\nI0123 10:51:48.537137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0123 10:51:48.548481 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0123 10:51:48.548515 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0123 10:51:48.548544 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0123 10:51:48.548577 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0123 10:51:48.561057 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0123 10:51:48.561112 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0123 10:51:48.561123 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0123 10:51:48.561139 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0123 10:51:48.561151 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0123 10:51:48.561158 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0123 10:51:48.561167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0123 10:51:48.561209 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0123 10:51:48.561756 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da77583099215643577c5d064d67ce2cca9d0b74e7ba7c88f3a948a8516fd66c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd61e9841e88c0a34764491cfafe3e4f94233e4ac0543a3e6ed1326dd8d7280d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd61e9841e88c0a34764491cfafe3e4f94233e4ac0543a3e6ed1326dd8d7280d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:53Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:53 crc kubenswrapper[4957]: I0123 10:51:53.980179 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:53Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:53 crc kubenswrapper[4957]: I0123 10:51:53.983387 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:51:53 crc kubenswrapper[4957]: I0123 10:51:53.983424 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:51:53 crc kubenswrapper[4957]: I0123 10:51:53.983437 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:51:53 crc kubenswrapper[4957]: I0123 10:51:53.983454 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:51:53 crc kubenswrapper[4957]: I0123 10:51:53.983464 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:51:53Z","lastTransitionTime":"2026-01-23T10:51:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:51:54 crc kubenswrapper[4957]: I0123 10:51:54.015814 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://512cd439903792d034cd6017d149d8f3e9e24ffbfc36964572fc9419d54c3513\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:54Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:54 crc kubenswrapper[4957]: I0123 10:51:54.054778 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:54Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:54 crc kubenswrapper[4957]: I0123 10:51:54.086125 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:51:54 crc kubenswrapper[4957]: I0123 10:51:54.086189 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:51:54 crc kubenswrapper[4957]: I0123 10:51:54.086201 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:51:54 crc kubenswrapper[4957]: I0123 10:51:54.086218 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:51:54 crc kubenswrapper[4957]: I0123 10:51:54.086230 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:51:54Z","lastTransitionTime":"2026-01-23T10:51:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:51:54 crc kubenswrapper[4957]: I0123 10:51:54.099101 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87adc28a-89e3-4743-a9f2-098d4a9432d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3de3a4f166173fb3f0ff7428b1d1ddf43a9988749fc445ecff
4a7527bd1f3a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3de3a4f166173fb3f0ff7428b1d1ddf43a9988749fc445ecff4a7527bd1f3a4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z8hcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:54Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:54 crc kubenswrapper[4957]: I0123 10:51:54.134067 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb53662e-fe72-4c19-b3a6-f5b541e5afcb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://829cfdb541d2a7861316957f39b8b9f43ec6f9f4e309a491f4451b1f3c34a9a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d9f270c80ebedc7d79510e2f421e23789483dce954f5e1469469703660febf\\\",\\\"image\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90926087d1bb350c991fa9425706fcc22e12eec003aba87b72758892aae9d3a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8144556693b41dc2f9121be49ceed161caf8db5eec797f086128a2016be8072\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:54Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:54 crc kubenswrapper[4957]: I0123 10:51:54.173167 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rg9hb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a6ddd9-627a-4faa-a4c4-096ea19af31d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f469de9a3c43ade33ae855757f1244dcd825827dea9633af7143c078b08d6d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wngq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rg9hb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:54Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:54 crc kubenswrapper[4957]: I0123 10:51:54.188097 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:51:54 crc kubenswrapper[4957]: I0123 10:51:54.188128 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:51:54 crc kubenswrapper[4957]: I0123 10:51:54.188147 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:51:54 crc kubenswrapper[4957]: I0123 10:51:54.188163 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:51:54 crc kubenswrapper[4957]: I0123 10:51:54.188172 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:51:54Z","lastTransitionTime":"2026-01-23T10:51:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:51:54 crc kubenswrapper[4957]: I0123 10:51:54.214079 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2xjv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"224e3211-1f68-4673-8975-7e71b1e513d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dd046581d049e9ca0071a010da143a9b28d271b533b9cdc1c94d19311be0320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s26mf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f355e8990ff693448c7b8df392b7b2caeb59d6fee6cf8d5d4200f8ce1b5e03ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s26mf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w2xjv\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:54Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:54 crc kubenswrapper[4957]: I0123 10:51:54.255146 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tlz2g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"233fdd78-4010-4fe8-9068-ee47d8ff25d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6851e0ec1040550b8c9edb1b85213d2c849e381fae6b0f09c9a7247bd9c5088\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\"
,\\\"name\\\":\\\"kube-api-access-jpwrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tlz2g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:54Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:54 crc kubenswrapper[4957]: I0123 10:51:54.290679 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:51:54 crc kubenswrapper[4957]: I0123 10:51:54.290715 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:51:54 crc kubenswrapper[4957]: I0123 10:51:54.290728 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:51:54 crc kubenswrapper[4957]: I0123 10:51:54.290744 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:51:54 crc kubenswrapper[4957]: I0123 10:51:54.290755 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:51:54Z","lastTransitionTime":"2026-01-23T10:51:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:51:54 crc kubenswrapper[4957]: I0123 10:51:54.295966 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb53662e-fe72-4c19-b3a6-f5b541e5afcb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://829cfdb541d2a7861316957f39b8b9f43ec6f9f4e309a491f4451b1f3c34a9a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d9f270c80ebedc7d79510e2f421e23789483dce954f5e1469469703660febf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90926087d1bb350c991fa9425706fcc22e12eec003aba87b72758892aae9d3a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8144556693b41dc2f9121be49ceed161caf8db5eec797f086128a2016be8072\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:54Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:54 crc kubenswrapper[4957]: I0123 10:51:54.335479 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rg9hb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a6ddd9-627a-4faa-a4c4-096ea19af31d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f469de9a3c43ade33ae855757f1244dcd825827dea9633af7143c078b08d6d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wngq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rg9hb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:54Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:54 crc kubenswrapper[4957]: I0123 10:51:54.378753 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2xjv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"224e3211-1f68-4673-8975-7e71b1e513d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dd046581d049e9ca0071a010da143a9b28d271b533b9cdc1c94d19311be0320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s26mf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f355e8990ff693448c7b8df392b7b2caeb59d6fee6cf8d5d4200f8ce1b5e03ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s26mf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w2xjv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:54Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:54 crc kubenswrapper[4957]: I0123 10:51:54.392833 4957 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:51:54 crc kubenswrapper[4957]: I0123 10:51:54.392891 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:51:54 crc kubenswrapper[4957]: I0123 10:51:54.392907 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:51:54 crc kubenswrapper[4957]: I0123 10:51:54.392930 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:51:54 crc kubenswrapper[4957]: I0123 10:51:54.392948 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:51:54Z","lastTransitionTime":"2026-01-23T10:51:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:51:54 crc kubenswrapper[4957]: I0123 10:51:54.419739 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tlz2g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"233fdd78-4010-4fe8-9068-ee47d8ff25d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6851e0ec1040550b8c9edb1b85213d2c849e381fae6b0f09c9a7247bd9c5088\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin
\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpwrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tlz2g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:54Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:54 crc kubenswrapper[4957]: I0123 10:51:54.466386 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87adc28a-89e3-4743-a9f2-098d4a9432d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3de3a4f166173fb3f0ff7428b1d1ddf43a9988749fc445ecff4a7527bd1f3a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3de3a4f166173fb3f0ff7428b1d1ddf43a9988749fc445ecff4a7527bd1f3a4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z8hcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:54Z 
is after 2025-08-24T17:21:41Z" Jan 23 10:51:54 crc kubenswrapper[4957]: I0123 10:51:54.495794 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:51:54 crc kubenswrapper[4957]: I0123 10:51:54.495863 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:51:54 crc kubenswrapper[4957]: I0123 10:51:54.495884 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:51:54 crc kubenswrapper[4957]: I0123 10:51:54.495910 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:51:54 crc kubenswrapper[4957]: I0123 10:51:54.495928 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:51:54Z","lastTransitionTime":"2026-01-23T10:51:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:51:54 crc kubenswrapper[4957]: I0123 10:51:54.504888 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8631604-63ce-40b0-b27e-fba17f940f20\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://523b9a208f414955faffe254957d3bb6d287eab26ea653e23c9bcc2c3182d5cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80645d17b02b24a907b20d376fcb65a794768d4c9cf07550bff63d50a011836d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:
07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3fcfc9fcf5f37f32b4a654710f3f0f5c3fab5b0b5c35239e5f1a2789d1ec480\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b48e3593322a778bf8d56e3509d97a341f1fee5e172f8ba4bbc4c1dacefb3930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://193091ca5d5fb974b1e2da289e7fbc6e2d3a292e79d1936c7ba10266a5ba9779\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7feaf0932d8842db7af2f28984a811c16674b50ba28a4627829ce8471543daa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731c
a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7feaf0932d8842db7af2f28984a811c16674b50ba28a4627829ce8471543daa9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://370f9a1676bfe308ad9883454af09a802df678db77e0e246de3cc7aea91410a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://370f9a1676bfe308ad9883454af09a802df678db77e0e246de3cc7aea91410a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://78b38484db8444b196494249f32bd63f2478d80e89ff823c8ac7c974b1ea6e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78b38484db8444b196494249f32bd63f2478d80e89ff823c8ac7c974b1ea6e76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:54Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:54 crc kubenswrapper[4957]: I0123 10:51:54.535643 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b07c6571fe0e39bd6607feb900919a481ef8a36483c9b4de1c6d5ea3453ba61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:54Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:54 crc kubenswrapper[4957]: I0123 10:51:54.581420 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fnxz6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c7b1449-2e9b-4c07-a531-591cb968f511\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6410b47b5b38b4ce50175e8cd9c2cc7ca241b914d1dba4accf3a1deb3e066ccd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj687\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:49Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fnxz6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:54Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:54 crc kubenswrapper[4957]: I0123 10:51:54.598340 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:51:54 crc kubenswrapper[4957]: I0123 10:51:54.598414 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:51:54 crc kubenswrapper[4957]: I0123 10:51:54.598436 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:51:54 crc kubenswrapper[4957]: I0123 10:51:54.598463 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:51:54 crc kubenswrapper[4957]: I0123 10:51:54.598483 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:51:54Z","lastTransitionTime":"2026-01-23T10:51:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:51:54 crc kubenswrapper[4957]: I0123 10:51:54.622136 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:54Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:54 crc kubenswrapper[4957]: I0123 10:51:54.664515 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6cq2v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11d94cd0-1619-4ef6-952a-aef84e1cdc75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d28aff8ae7d4adbef5753dc21af0f1055f19ed4404bf9af31f34be215120555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d28aff8ae7d4adbef5753dc21af0f1055f19ed4404bf9af31f34be215120555\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90e2094361e033ef7c8226913e9e396e6a2036617edb9c5ad0ed00c2c5f9f707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90e2094361e033ef7c8226913e9e396e6a2036617edb9c5ad0ed00c2c5f9f707\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\
"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e945a247fdc5833508edb04c5130a406d137e3b0ab1be12f9bf954f2621c3bf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e945a247fdc5833508edb04c5130a406d137e3b0ab1be12f9bf954f2621c3bf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6cq2v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:54Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:54 crc kubenswrapper[4957]: I0123 10:51:54.699989 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea507738-b425-4366-808b-3a47317e66d0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cdf71b1a8491d3a4853fde19a5b1af1eb4697cbf07de482e22a52704ba0470f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52eea4c3c7c3b8898e64dd0eb05c1883ea1c2fa94e7e606f3ab48bbf5aaee8d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f405b6b517d30a201b793965bd82536f496d62b89562cefc7e3a9d9f7829633\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b6915f908509c8609290327ffc2dccf0e5680dc227979285a7ebaca4643cb7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e837e02e63dbe59e7920302c0fb0b5c9165e96ebb684adadb02bacd61633214\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"le observer\\\\nW0123 10:51:48.273886 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0123 10:51:48.273997 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0123 10:51:48.275269 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3502356273/tls.crt::/tmp/serving-cert-3502356273/tls.key\\\\\\\"\\\\nI0123 10:51:48.537137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0123 10:51:48.548481 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0123 10:51:48.548515 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0123 10:51:48.548544 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0123 10:51:48.548577 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0123 10:51:48.561057 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0123 10:51:48.561112 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0123 10:51:48.561123 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0123 10:51:48.561139 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0123 10:51:48.561151 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0123 10:51:48.561158 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0123 10:51:48.561167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0123 10:51:48.561209 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0123 10:51:48.561756 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da77583099215643577c5d064d67ce2cca9d0b74e7ba7c88f3a948a8516fd66c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd61e9841e88c0a34764491cfafe3e4f94233e4ac0543a3e6ed1326dd8d7280d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd61e9841e88c0a34764491cfafe3e4f94233e4ac0543a3e6ed1326dd8d7280d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:54Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:54 crc kubenswrapper[4957]: I0123 10:51:54.701613 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:51:54 crc kubenswrapper[4957]: I0123 10:51:54.701677 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:51:54 crc kubenswrapper[4957]: I0123 10:51:54.701692 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:51:54 crc kubenswrapper[4957]: I0123 10:51:54.701716 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:51:54 crc kubenswrapper[4957]: I0123 10:51:54.701731 4957 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:51:54Z","lastTransitionTime":"2026-01-23T10:51:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:51:54 crc kubenswrapper[4957]: I0123 10:51:54.724273 4957 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 02:59:27.80504219 +0000 UTC Jan 23 10:51:54 crc kubenswrapper[4957]: I0123 10:51:54.738047 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:54Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:54 crc kubenswrapper[4957]: I0123 10:51:54.769247 4957 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 23 10:51:54 crc kubenswrapper[4957]: E0123 10:51:54.769450 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 23 10:51:54 crc kubenswrapper[4957]: I0123 10:51:54.769252 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 23 10:51:54 crc kubenswrapper[4957]: E0123 10:51:54.769553 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 23 10:51:54 crc kubenswrapper[4957]: I0123 10:51:54.769760 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 23 10:51:54 crc kubenswrapper[4957]: E0123 10:51:54.769871 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 23 10:51:54 crc kubenswrapper[4957]: I0123 10:51:54.781031 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9f19c40d295c11e3a1170d61fd738b1dcd8fb10087f6a1bb74e6e6c8e6cfb56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6fcda9eaf99f7b60db85da6f18a98ccca7b5bc532aa28388fc7845caf1a7356\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:54Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:54 crc kubenswrapper[4957]: I0123 10:51:54.803679 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Jan 23 10:51:54 crc kubenswrapper[4957]: I0123 10:51:54.803722 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:51:54 crc kubenswrapper[4957]: I0123 10:51:54.803734 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:51:54 crc kubenswrapper[4957]: I0123 10:51:54.803752 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:51:54 crc kubenswrapper[4957]: I0123 10:51:54.803765 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:51:54Z","lastTransitionTime":"2026-01-23T10:51:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:51:54 crc kubenswrapper[4957]: I0123 10:51:54.821850 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://512cd439903792d034cd6017d149d8f3e9e24ffbfc36964572fc9419d54c3513\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:54Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:54 crc kubenswrapper[4957]: I0123 10:51:54.856521 4957 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:54Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:54 crc kubenswrapper[4957]: I0123 10:51:54.905813 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:51:54 crc kubenswrapper[4957]: I0123 10:51:54.905849 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:51:54 crc kubenswrapper[4957]: I0123 10:51:54.905861 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:51:54 crc kubenswrapper[4957]: I0123 10:51:54.905898 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:51:54 crc kubenswrapper[4957]: I0123 10:51:54.905909 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:51:54Z","lastTransitionTime":"2026-01-23T10:51:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:51:54 crc kubenswrapper[4957]: I0123 10:51:54.943576 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" event={"ID":"87adc28a-89e3-4743-a9f2-098d4a9432d8","Type":"ContainerStarted","Data":"1be5b459e3fae28da165ef0ee506ec5ccd39026d7b7e7c35a3f242c65d60d0ad"} Jan 23 10:51:54 crc kubenswrapper[4957]: I0123 10:51:54.946272 4957 generic.go:334] "Generic (PLEG): container finished" podID="11d94cd0-1619-4ef6-952a-aef84e1cdc75" containerID="26ffeff282bb20d68e9f648999f28b818a67b74f062714fd26ad22e70591a74c" exitCode=0 Jan 23 10:51:54 crc kubenswrapper[4957]: I0123 10:51:54.946321 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6cq2v" event={"ID":"11d94cd0-1619-4ef6-952a-aef84e1cdc75","Type":"ContainerDied","Data":"26ffeff282bb20d68e9f648999f28b818a67b74f062714fd26ad22e70591a74c"} Jan 23 10:51:54 crc kubenswrapper[4957]: I0123 10:51:54.965518 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://512cd439903792d034cd6017d149d8f3e9e24ffbfc36964572fc9419d54c3513\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:54Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:54 crc kubenswrapper[4957]: I0123 10:51:54.981816 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:54Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:54 crc kubenswrapper[4957]: I0123 10:51:54.997832 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb53662e-fe72-4c19-b3a6-f5b541e5afcb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://829cfdb541d2a7861316957f39b8b9f43ec6f9f4e309a491f4451b1f3c34a9a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d9f270c80ebedc7d79510e2f421e23789483dce954f5e1469469703660febf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90926087d1bb350c991fa9425706fcc22e12eec003aba87b72758892aae9d3a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8144556693b41dc2f9121be49ceed161caf8db5eec797f086128a2016be8072\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:54Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:55 crc kubenswrapper[4957]: I0123 10:51:55.008597 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:51:55 crc kubenswrapper[4957]: I0123 10:51:55.008668 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:51:55 crc kubenswrapper[4957]: I0123 10:51:55.008692 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:51:55 crc kubenswrapper[4957]: I0123 10:51:55.008725 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:51:55 crc kubenswrapper[4957]: I0123 10:51:55.008747 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:51:55Z","lastTransitionTime":"2026-01-23T10:51:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:51:55 crc kubenswrapper[4957]: I0123 10:51:55.016457 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rg9hb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a6ddd9-627a-4faa-a4c4-096ea19af31d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f469de9a3c43ade33ae855757f1244dcd825827dea9633af7143c078b08d6d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wngq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rg9hb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:55Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:55 crc kubenswrapper[4957]: I0123 10:51:55.059047 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2xjv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"224e3211-1f68-4673-8975-7e71b1e513d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dd046581d049e9ca0071a010da143a9b28d271b533b9cdc1c94d19311be0320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s26mf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f355e8990ff693448c7b8df392b7b2caeb59d6fee6cf8d5d4200f8ce1b5e03ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s26mf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w2xjv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:55Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:55 crc kubenswrapper[4957]: I0123 10:51:55.095477 4957 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-tlz2g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"233fdd78-4010-4fe8-9068-ee47d8ff25d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6851e0ec1040550b8c9edb1b85213d2c849e381fae6b0f09c9a7247bd9c5088\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpwrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-tlz2g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:55Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:55 crc kubenswrapper[4957]: I0123 10:51:55.113283 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:51:55 crc kubenswrapper[4957]: I0123 10:51:55.113329 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:51:55 crc kubenswrapper[4957]: I0123 10:51:55.113338 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:51:55 crc kubenswrapper[4957]: I0123 10:51:55.113352 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:51:55 crc kubenswrapper[4957]: I0123 10:51:55.113361 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:51:55Z","lastTransitionTime":"2026-01-23T10:51:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:51:55 crc kubenswrapper[4957]: I0123 10:51:55.150511 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87adc28a-89e3-4743-a9f2-098d4a9432d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3de3a4f166173fb3f0ff7428b1d1ddf43a9988749fc445ecff4a7527bd1f3a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3de3a4f166173fb3f0ff7428b1d1ddf43a9988749fc445ecff4a7527bd1f3a4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z8hcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:55Z 
is after 2025-08-24T17:21:41Z" Jan 23 10:51:55 crc kubenswrapper[4957]: I0123 10:51:55.182830 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8631604-63ce-40b0-b27e-fba17f940f20\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://523b9a208f414955faffe254957d3bb6d287eab26ea653e23c9bcc2c3182d5cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80645d17b02b24a907b20d376fcb65a794768d4c9cf07550bff63d50a011836d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3fcfc9fcf5f37f32b4a654710f3f0f5c3fab5b0b5c35239e5f1a2789d1ec480\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"
/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b48e3593322a778bf8d56e3509d97a341f1fee5e172f8ba4bbc4c1dacefb3930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://193091ca5d5fb974b1e2da289e7fbc6e2d3a292e79d1936c7ba10266a5ba9779\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7feaf0932d8842db7af2f28984a811c16674b50ba28a4627829ce8471543daa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7feaf0932d8842db7af2f28984a811c16674b50ba28a4627829ce8471543daa9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://370f9a1676bfe308ad9883454af09a802df678db77e0e246de3cc7aea91410a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://370f9a1676bfe308ad9883454af09a802df678db77e0e246de3cc7aea91410a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\"
:\\\"2026-01-23T10:51:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://78b38484db8444b196494249f32bd63f2478d80e89ff823c8ac7c974b1ea6e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78b38484db8444b196494249f32bd63f2478d80e89ff823c8ac7c974b1ea6e76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:55Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:55 crc kubenswrapper[4957]: I0123 10:51:55.216057 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:51:55 crc kubenswrapper[4957]: I0123 10:51:55.216090 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:51:55 crc kubenswrapper[4957]: I0123 10:51:55.216100 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:51:55 crc kubenswrapper[4957]: I0123 10:51:55.216141 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:51:55 crc kubenswrapper[4957]: I0123 10:51:55.216153 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:51:55Z","lastTransitionTime":"2026-01-23T10:51:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:51:55 crc kubenswrapper[4957]: I0123 10:51:55.217223 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b07c6571fe0e39bd6607feb900919a481ef8a36483c9b4de1c6d5ea3453ba61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:55Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:55 crc kubenswrapper[4957]: I0123 10:51:55.256211 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fnxz6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c7b1449-2e9b-4c07-a531-591cb968f511\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6410b47b5b38b4ce50175e8cd9c2cc7ca241b914d1dba4accf3a1deb3e066ccd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj687\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:49Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fnxz6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:55Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:55 crc kubenswrapper[4957]: I0123 10:51:55.296408 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6cq2v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11d94cd0-1619-4ef6-952a-aef84e1cdc75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d28aff8ae7d4adbef5753dc21af0f1055f19ed4404bf9af31f34be215120555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d28aff8ae7d4adbef5753dc21af0f1055f19ed4404bf9af31f34be215120555\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90e2094361e033ef7c8226913e9e396e6a2036617edb9c5ad0ed00c2c5f9f707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90e2094361e033ef7c8226913e9e396e6a2036617edb9c5ad0ed00c2c5f9f707\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\
"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e945a247fdc5833508edb04c5130a406d137e3b0ab1be12f9bf954f2621c3bf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e945a247fdc5833508edb04c5130a406d137e3b0ab1be12f9bf954f2621c3bf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26ffeff282bb20d68e9f648999f28b818a67b74f062714fd26ad22e70591a74c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26ffeff282bb20d68e9f648999f28b818a67b74f062714fd26ad22e70591a74c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"w
aiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6cq2v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:55Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:55 crc kubenswrapper[4957]: I0123 10:51:55.318608 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:51:55 crc kubenswrapper[4957]: I0123 10:51:55.318648 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:51:55 crc kubenswrapper[4957]: I0123 10:51:55.318660 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:51:55 crc kubenswrapper[4957]: I0123 10:51:55.318675 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:51:55 crc kubenswrapper[4957]: I0123 10:51:55.318686 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:51:55Z","lastTransitionTime":"2026-01-23T10:51:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:51:55 crc kubenswrapper[4957]: I0123 10:51:55.342330 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea507738-b425-4366-808b-3a47317e66d0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cdf71b1a8491d3a4853fde19a5b1af1eb4697cbf07de482e22a52704ba0470f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52eea4c3c7c3b8898e64dd0eb05c1883ea1c2fa94e7e606f3ab48bbf5aaee8d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f405b6b517d30a201b793965bd82536f496d62b89562cefc7e3a9d9f7829633\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b6915f908509c8609290327ffc2dccf0e5680dc227979285a7ebaca4643cb7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e837e02e63dbe59e7920302c0fb0b5c9165e96ebb684adadb02bacd61633214\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"le observer\\\\nW0123 10:51:48.273886 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0123 10:51:48.273997 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0123 10:51:48.275269 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3502356273/tls.crt::/tmp/serving-cert-3502356273/tls.key\\\\\\\"\\\\nI0123 10:51:48.537137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0123 10:51:48.548481 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0123 10:51:48.548515 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0123 10:51:48.548544 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0123 10:51:48.548577 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0123 10:51:48.561057 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0123 10:51:48.561112 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0123 10:51:48.561123 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0123 10:51:48.561139 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0123 10:51:48.561151 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0123 10:51:48.561158 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0123 10:51:48.561167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0123 10:51:48.561209 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0123 10:51:48.561756 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da77583099215643577c5d064d67ce2cca9d0b74e7ba7c88f3a948a8516fd66c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd61e9841e88c0a34764491cfafe3e4f94233e4ac0543a3e6ed1326dd8d7280d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd61e9841e88c0a34764491cfafe3e4f94233e4ac0543a3e6ed1326dd8d7280d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:55Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:55 crc kubenswrapper[4957]: I0123 10:51:55.377034 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:55Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:55 crc kubenswrapper[4957]: I0123 10:51:55.420238 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9f19c40d295c11e3a1170d61fd738b1dcd8fb10087f6a1bb74e6e6c8e6cfb56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6fcda9eaf99f7b60db85da6f18a98ccca7b5bc532aa28388fc7845caf1a7356\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:55Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:55 crc kubenswrapper[4957]: I0123 10:51:55.421432 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:51:55 crc kubenswrapper[4957]: I0123 10:51:55.421455 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:51:55 crc kubenswrapper[4957]: I0123 10:51:55.421466 4957 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Jan 23 10:51:55 crc kubenswrapper[4957]: I0123 10:51:55.421479 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:51:55 crc kubenswrapper[4957]: I0123 10:51:55.421488 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:51:55Z","lastTransitionTime":"2026-01-23T10:51:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:51:55 crc kubenswrapper[4957]: I0123 10:51:55.457977 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:55Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:55 crc kubenswrapper[4957]: I0123 10:51:55.523748 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:51:55 crc kubenswrapper[4957]: I0123 10:51:55.523792 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:51:55 crc kubenswrapper[4957]: I0123 10:51:55.523801 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:51:55 crc kubenswrapper[4957]: I0123 10:51:55.523816 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:51:55 crc kubenswrapper[4957]: I0123 10:51:55.523829 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:51:55Z","lastTransitionTime":"2026-01-23T10:51:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:51:55 crc kubenswrapper[4957]: I0123 10:51:55.626587 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:51:55 crc kubenswrapper[4957]: I0123 10:51:55.626661 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:51:55 crc kubenswrapper[4957]: I0123 10:51:55.626684 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:51:55 crc kubenswrapper[4957]: I0123 10:51:55.626713 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:51:55 crc kubenswrapper[4957]: I0123 10:51:55.626734 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:51:55Z","lastTransitionTime":"2026-01-23T10:51:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:51:55 crc kubenswrapper[4957]: I0123 10:51:55.725373 4957 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 20:31:03.943523611 +0000 UTC Jan 23 10:51:55 crc kubenswrapper[4957]: I0123 10:51:55.730084 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:51:55 crc kubenswrapper[4957]: I0123 10:51:55.730119 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:51:55 crc kubenswrapper[4957]: I0123 10:51:55.730130 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:51:55 crc kubenswrapper[4957]: I0123 10:51:55.730143 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:51:55 crc kubenswrapper[4957]: I0123 10:51:55.730153 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:51:55Z","lastTransitionTime":"2026-01-23T10:51:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:51:55 crc kubenswrapper[4957]: I0123 10:51:55.832884 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:51:55 crc kubenswrapper[4957]: I0123 10:51:55.832954 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:51:55 crc kubenswrapper[4957]: I0123 10:51:55.832981 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:51:55 crc kubenswrapper[4957]: I0123 10:51:55.833009 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:51:55 crc kubenswrapper[4957]: I0123 10:51:55.833030 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:51:55Z","lastTransitionTime":"2026-01-23T10:51:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:51:55 crc kubenswrapper[4957]: I0123 10:51:55.936163 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:51:55 crc kubenswrapper[4957]: I0123 10:51:55.936202 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:51:55 crc kubenswrapper[4957]: I0123 10:51:55.936213 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:51:55 crc kubenswrapper[4957]: I0123 10:51:55.936229 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:51:55 crc kubenswrapper[4957]: I0123 10:51:55.936239 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:51:55Z","lastTransitionTime":"2026-01-23T10:51:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:51:55 crc kubenswrapper[4957]: I0123 10:51:55.953388 4957 generic.go:334] "Generic (PLEG): container finished" podID="11d94cd0-1619-4ef6-952a-aef84e1cdc75" containerID="7bbe74c2c27b97766cfa5d26987cd0fbd8289871e66eab78c54a52c0aa039728" exitCode=0 Jan 23 10:51:55 crc kubenswrapper[4957]: I0123 10:51:55.953481 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6cq2v" event={"ID":"11d94cd0-1619-4ef6-952a-aef84e1cdc75","Type":"ContainerDied","Data":"7bbe74c2c27b97766cfa5d26987cd0fbd8289871e66eab78c54a52c0aa039728"} Jan 23 10:51:55 crc kubenswrapper[4957]: I0123 10:51:55.977356 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:55Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:55 crc kubenswrapper[4957]: I0123 10:51:55.992511 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9f19c40d295c11e3a1170d61fd738b1dcd8fb10087f6a1bb74e6e6c8e6cfb56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6fcda9eaf99f7b60db85da6f18a98ccca7b5bc532aa28388fc7845caf1a7356\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:55Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:56 crc kubenswrapper[4957]: I0123 10:51:56.010084 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:56Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:56 crc kubenswrapper[4957]: I0123 10:51:56.024013 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6cq2v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11d94cd0-1619-4ef6-952a-aef84e1cdc75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d28aff8ae7d4adbef5753dc21af0f1055f19ed4404bf9af31f34be215120555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d28aff8ae7d4adbef5753dc21af0f1055f19ed4404bf9af31f34be215120555\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90e2094361e033ef7c8226913e9e396e6a2036617edb9c5ad0ed00c2c5f9f707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90e2094361e033ef7c8226913e9e396e6a2036617edb9c5ad0ed00c2c5f9f707\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e945a247fdc5833508edb04c5130a406d137e3b0ab1be12f9bf954f2621c3bf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e945a247fdc5833508edb04c5130a406d137e3b0ab1be12f9bf954f2621c3bf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26ffeff282bb20d68e9f648999f28b818a67b74f062714fd26ad22e70591a74c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26ffeff282bb20d68e9f648999f28b818a67b74f062714fd26ad22e70591a74c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bbe74c2c27b97766cfa5d26987cd0fbd8289871e66eab78c54a52c0aa039728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bbe74c2c27b97766cfa5d26987cd0fbd8289871e66eab78c54a52c0aa039728\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6cq2v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:56Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:56 crc kubenswrapper[4957]: I0123 10:51:56.038519 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:51:56 crc kubenswrapper[4957]: I0123 10:51:56.038582 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:51:56 crc kubenswrapper[4957]: I0123 10:51:56.038596 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:51:56 crc kubenswrapper[4957]: I0123 10:51:56.038621 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:51:56 crc kubenswrapper[4957]: I0123 10:51:56.038638 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:51:56Z","lastTransitionTime":"2026-01-23T10:51:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:51:56 crc kubenswrapper[4957]: I0123 10:51:56.040763 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea507738-b425-4366-808b-3a47317e66d0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cdf71b1a8491d3a4853fde19a5b1af1eb4697cbf07de482e22a52704ba0470f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52eea4c3c7c3b8898e64dd0eb05c1883ea1c2fa94e7e606f3ab48bbf5aaee8d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f405b6b517d30a201b793965bd82536f496d62b89562cefc7e3a9d9f7829633\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b6915f908509c8609290327ffc2dccf0e5680dc227979285a7ebaca4643cb7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e837e02e63dbe59e7920302c0fb0b5c9165e96ebb684adadb02bacd61633214\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"le observer\\\\nW0123 10:51:48.273886 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0123 10:51:48.273997 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0123 10:51:48.275269 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3502356273/tls.crt::/tmp/serving-cert-3502356273/tls.key\\\\\\\"\\\\nI0123 10:51:48.537137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0123 10:51:48.548481 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0123 10:51:48.548515 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0123 10:51:48.548544 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0123 10:51:48.548577 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0123 10:51:48.561057 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0123 10:51:48.561112 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0123 10:51:48.561123 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0123 10:51:48.561139 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0123 10:51:48.561151 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0123 10:51:48.561158 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0123 10:51:48.561167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0123 10:51:48.561209 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0123 10:51:48.561756 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da77583099215643577c5d064d67ce2cca9d0b74e7ba7c88f3a948a8516fd66c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd61e9841e88c0a34764491cfafe3e4f94233e4ac0543a3e6ed1326dd8d7280d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd61e9841e88c0a34764491cfafe3e4f94233e4ac0543a3e6ed1326dd8d7280d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:56Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:56 crc kubenswrapper[4957]: I0123 10:51:56.062192 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://512cd439903792d034cd6017d149d8f3e9e24ffbfc36964572fc9419d54c3513\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:56Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:56 crc kubenswrapper[4957]: I0123 10:51:56.077009 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:56Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:56 crc kubenswrapper[4957]: I0123 10:51:56.092747 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tlz2g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"233fdd78-4010-4fe8-9068-ee47d8ff25d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6851e0ec1040550b8c9edb1b85213d2c849e381fae6b0f09c9a7247bd9c5088\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\
\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpwrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tlz2g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:56Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:56 crc kubenswrapper[4957]: I0123 10:51:56.112007 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87adc28a-89e3-4743-a9f2-098d4a9432d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3de3a4f166173fb3f0ff7428b1d1ddf43a9988749fc445ecff4a7527bd1f3a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3de3a4f166173fb3f0ff7428b1d1ddf43a9988749fc445ecff4a7527bd1f3a4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z8hcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:56Z 
is after 2025-08-24T17:21:41Z" Jan 23 10:51:56 crc kubenswrapper[4957]: I0123 10:51:56.128117 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb53662e-fe72-4c19-b3a6-f5b541e5afcb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://829cfdb541d2a7861316957f39b8b9f43ec6f9f4e309a491f4451b1f3c34a9a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d9f270c80ebedc7d79510e2f421e23789483dce954f5e1469469703660febf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90926087d1bb350c991fa9425706fcc22e12eec003aba87b72758892aae9d3a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\
\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8144556693b41dc2f9121be49ceed161caf8db5eec797f086128a2016be8072\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:56Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:56 crc kubenswrapper[4957]: I0123 10:51:56.137210 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rg9hb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a6ddd9-627a-4faa-a4c4-096ea19af31d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f469de9a3c43ade33ae855757f1244dcd825827dea9633af7143c078b08d6d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wngq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rg9hb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:56Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:56 crc kubenswrapper[4957]: I0123 10:51:56.141101 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:51:56 crc kubenswrapper[4957]: I0123 10:51:56.141137 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:51:56 crc kubenswrapper[4957]: I0123 10:51:56.141149 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:51:56 crc kubenswrapper[4957]: I0123 10:51:56.141166 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:51:56 crc kubenswrapper[4957]: I0123 10:51:56.141178 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:51:56Z","lastTransitionTime":"2026-01-23T10:51:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:51:56 crc kubenswrapper[4957]: I0123 10:51:56.148056 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2xjv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"224e3211-1f68-4673-8975-7e71b1e513d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dd046581d049e9ca0071a010da143a9b28d271b533b9cdc1c94d19311be0320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s26mf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f355e8990ff693448c7b8df392b7b2caeb59d6fee6cf8d5d4200f8ce1b5e03ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s26mf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w2xjv\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:56Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:56 crc kubenswrapper[4957]: I0123 10:51:56.169352 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8631604-63ce-40b0-b27e-fba17f940f20\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://523b9a208f414955faffe254957d3bb6d287eab26ea653e23c9bcc2c3182d5cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80645d17b02b24a907b20d376fcb65a794768d4c9cf07550bff63d50a011836d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3fcfc9fcf5f37f32b4a654710f3f0f5c3fab5b0b5c35239e5f1a2789d1ec480\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc
2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b48e3593322a778bf8d56e3509d97a341f1fee5e172f8ba4bbc4c1dacefb3930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://193091ca5d5fb974b1e2da289e7fbc6e2d3a292e79d1936c7ba10266a5ba9779\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7feaf0932d8842db7af2f28984a811c16674b50ba28a4627829ce8471543daa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7feaf0932d8842db7af2f28984a811c16674b50ba28a4627829ce8471543daa9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://370f9a1676bfe308ad9883454af09a802df678db77e0e246de3cc7aea91410a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\
\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://370f9a1676bfe308ad9883454af09a802df678db77e0e246de3cc7aea91410a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://78b38484db8444b196494249f32bd63f2478d80e89ff823c8ac7c974b1ea6e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78b38484db8444b196494249f32bd63f2478d80e89ff823c8ac7c974b1ea6e76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:56Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:56 crc kubenswrapper[4957]: I0123 10:51:56.179637 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b07c6571fe0e39bd6607feb900919a481ef8a36483c9b4de1c6d5ea3453ba61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:56Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:56 crc kubenswrapper[4957]: I0123 10:51:56.191409 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fnxz6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c7b1449-2e9b-4c07-a531-591cb968f511\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6410b47b5b38b4ce50175e8cd9c2cc7ca241b914d1dba4accf3a1deb3e066ccd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj687\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:49Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fnxz6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:56Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:56 crc kubenswrapper[4957]: I0123 10:51:56.243112 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:51:56 crc kubenswrapper[4957]: I0123 10:51:56.243166 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:51:56 crc kubenswrapper[4957]: I0123 10:51:56.243185 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:51:56 crc kubenswrapper[4957]: I0123 10:51:56.243205 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:51:56 crc kubenswrapper[4957]: I0123 10:51:56.243217 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:51:56Z","lastTransitionTime":"2026-01-23T10:51:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:51:56 crc kubenswrapper[4957]: I0123 10:51:56.346016 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:51:56 crc kubenswrapper[4957]: I0123 10:51:56.346077 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:51:56 crc kubenswrapper[4957]: I0123 10:51:56.346097 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:51:56 crc kubenswrapper[4957]: I0123 10:51:56.346124 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:51:56 crc kubenswrapper[4957]: I0123 10:51:56.346141 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:51:56Z","lastTransitionTime":"2026-01-23T10:51:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:51:56 crc kubenswrapper[4957]: I0123 10:51:56.449546 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:51:56 crc kubenswrapper[4957]: I0123 10:51:56.449663 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:51:56 crc kubenswrapper[4957]: I0123 10:51:56.449689 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:51:56 crc kubenswrapper[4957]: I0123 10:51:56.449724 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:51:56 crc kubenswrapper[4957]: I0123 10:51:56.449748 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:51:56Z","lastTransitionTime":"2026-01-23T10:51:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:51:56 crc kubenswrapper[4957]: I0123 10:51:56.551636 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:51:56 crc kubenswrapper[4957]: I0123 10:51:56.551696 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:51:56 crc kubenswrapper[4957]: I0123 10:51:56.551714 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:51:56 crc kubenswrapper[4957]: I0123 10:51:56.551744 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:51:56 crc kubenswrapper[4957]: I0123 10:51:56.551763 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:51:56Z","lastTransitionTime":"2026-01-23T10:51:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:51:56 crc kubenswrapper[4957]: I0123 10:51:56.586784 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 23 10:51:56 crc kubenswrapper[4957]: I0123 10:51:56.586940 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 23 10:51:56 crc kubenswrapper[4957]: E0123 10:51:56.587024 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-23 10:52:04.586980011 +0000 UTC m=+34.124232748 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 23 10:51:56 crc kubenswrapper[4957]: E0123 10:51:56.587094 4957 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 23 10:51:56 crc kubenswrapper[4957]: I0123 10:51:56.587133 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 23 10:51:56 crc kubenswrapper[4957]: E0123 10:51:56.587172 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-23 10:52:04.587147835 +0000 UTC m=+34.124400562 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 23 10:51:56 crc kubenswrapper[4957]: I0123 10:51:56.587243 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 23 10:51:56 crc kubenswrapper[4957]: E0123 10:51:56.587264 4957 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 23 10:51:56 crc kubenswrapper[4957]: I0123 10:51:56.587383 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 23 10:51:56 crc kubenswrapper[4957]: E0123 10:51:56.587458 4957 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 23 10:51:56 crc kubenswrapper[4957]: E0123 10:51:56.587621 4957 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 23 10:51:56 crc kubenswrapper[4957]: E0123 10:51:56.587316 4957 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 23 10:51:56 crc kubenswrapper[4957]: E0123 10:51:56.587560 4957 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 23 10:51:56 crc kubenswrapper[4957]: E0123 10:51:56.587748 4957 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 23 10:51:56 crc kubenswrapper[4957]: E0123 10:51:56.587784 4957 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 23 10:51:56 crc kubenswrapper[4957]: E0123 10:51:56.587659 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-23 10:52:04.587648528 +0000 UTC m=+34.124901225 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 23 10:51:56 crc kubenswrapper[4957]: E0123 10:51:56.587825 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-23 10:52:04.587815952 +0000 UTC m=+34.125068649 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 23 10:51:56 crc kubenswrapper[4957]: E0123 10:51:56.587838 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-23 10:52:04.587832213 +0000 UTC m=+34.125084910 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 23 10:51:56 crc kubenswrapper[4957]: I0123 10:51:56.653792 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:51:56 crc kubenswrapper[4957]: I0123 10:51:56.653823 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:51:56 crc kubenswrapper[4957]: I0123 10:51:56.653832 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:51:56 crc kubenswrapper[4957]: I0123 10:51:56.653844 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:51:56 crc kubenswrapper[4957]: I0123 10:51:56.653853 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:51:56Z","lastTransitionTime":"2026-01-23T10:51:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:51:56 crc kubenswrapper[4957]: I0123 10:51:56.726352 4957 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-11 10:58:06.207372008 +0000 UTC Jan 23 10:51:56 crc kubenswrapper[4957]: I0123 10:51:56.756635 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:51:56 crc kubenswrapper[4957]: I0123 10:51:56.756659 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:51:56 crc kubenswrapper[4957]: I0123 10:51:56.756667 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:51:56 crc kubenswrapper[4957]: I0123 10:51:56.756680 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:51:56 crc kubenswrapper[4957]: I0123 10:51:56.756691 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:51:56Z","lastTransitionTime":"2026-01-23T10:51:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:51:56 crc kubenswrapper[4957]: I0123 10:51:56.769462 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 23 10:51:56 crc kubenswrapper[4957]: I0123 10:51:56.769462 4957 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 23 10:51:56 crc kubenswrapper[4957]: E0123 10:51:56.769584 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 23 10:51:56 crc kubenswrapper[4957]: I0123 10:51:56.769687 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 23 10:51:56 crc kubenswrapper[4957]: E0123 10:51:56.769869 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 23 10:51:56 crc kubenswrapper[4957]: E0123 10:51:56.769966 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 23 10:51:56 crc kubenswrapper[4957]: I0123 10:51:56.859362 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:51:56 crc kubenswrapper[4957]: I0123 10:51:56.859398 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:51:56 crc kubenswrapper[4957]: I0123 10:51:56.859407 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:51:56 crc kubenswrapper[4957]: I0123 10:51:56.859420 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:51:56 crc kubenswrapper[4957]: I0123 10:51:56.859429 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:51:56Z","lastTransitionTime":"2026-01-23T10:51:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:51:56 crc kubenswrapper[4957]: I0123 10:51:56.963001 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:51:56 crc kubenswrapper[4957]: I0123 10:51:56.963459 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:51:56 crc kubenswrapper[4957]: I0123 10:51:56.963705 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:51:56 crc kubenswrapper[4957]: I0123 10:51:56.963966 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:51:56 crc kubenswrapper[4957]: I0123 10:51:56.964156 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:51:56Z","lastTransitionTime":"2026-01-23T10:51:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:51:56 crc kubenswrapper[4957]: I0123 10:51:56.972785 4957 generic.go:334] "Generic (PLEG): container finished" podID="11d94cd0-1619-4ef6-952a-aef84e1cdc75" containerID="b9ee5b55e77324735662dd6bc0fdeee86af454eb4b9e8eb9e877119f7c1395ad" exitCode=0 Jan 23 10:51:56 crc kubenswrapper[4957]: I0123 10:51:56.972864 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6cq2v" event={"ID":"11d94cd0-1619-4ef6-952a-aef84e1cdc75","Type":"ContainerDied","Data":"b9ee5b55e77324735662dd6bc0fdeee86af454eb4b9e8eb9e877119f7c1395ad"} Jan 23 10:51:56 crc kubenswrapper[4957]: I0123 10:51:56.992332 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:56Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:57 crc kubenswrapper[4957]: I0123 10:51:57.018636 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9f19c40d295c11e3a1170d61fd738b1dcd8fb10087f6a1bb74e6e6c8e6cfb56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6fcda9eaf99f7b60db85da6f18a98ccca7b5bc532aa28388fc7845caf1a7356\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:57Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:57 crc kubenswrapper[4957]: I0123 10:51:57.036717 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:57Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:57 crc kubenswrapper[4957]: I0123 10:51:57.054143 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6cq2v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11d94cd0-1619-4ef6-952a-aef84e1cdc75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d28aff8ae7d4adbef5753dc21af0f1055f19ed4404bf9af31f34be215120555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d28aff8ae7d4adbef5753dc21af0f1055f19ed4404bf9af31f34be215120555\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90e2094361e033ef7c8226913e9e396e6a2036617edb9c5ad0ed00c2c5f9f707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90e2094361e033ef7c8226913e9e396e6a2036617edb9c5ad0ed00c2c5f9f707\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e945a247fdc5833508edb04c5130a406d137e3b0ab1be12f9bf954f2621c3bf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e945a247fdc5833508edb04c5130a406d137e3b0ab1be12f9bf954f2621c3bf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26ffeff282bb20d68e9f648999f28b818a67b74f062714fd26ad22e70591a74c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26ffeff282bb20d68e9f648999f28b818a67b74f062714fd26ad22e70591a74c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bbe74c2c27b97766cfa5d26987cd0fbd8289871e66eab78c54a52c0aa039728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bbe74c2c27b97766cfa5d26987cd0fbd8289871e66eab78c54a52c0aa039728\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9ee5b55e77324735662dd6bc0fdeee86af454eb4b9e8eb9e877119f7c1395ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9ee5b55e77324735662dd6bc0fdeee86af454eb4b9e8eb9e877119f7c1395ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6cq2v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:57Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:57 crc kubenswrapper[4957]: I0123 10:51:57.071400 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:51:57 crc kubenswrapper[4957]: I0123 10:51:57.071451 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:51:57 crc kubenswrapper[4957]: I0123 10:51:57.071467 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:51:57 crc kubenswrapper[4957]: I0123 10:51:57.071491 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:51:57 crc kubenswrapper[4957]: I0123 10:51:57.071507 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:51:57Z","lastTransitionTime":"2026-01-23T10:51:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:51:57 crc kubenswrapper[4957]: I0123 10:51:57.075162 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea507738-b425-4366-808b-3a47317e66d0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cdf71b1a8491d3a4853fde19a5b1af1eb4697cbf07de482e22a52704ba0470f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52eea4c3c7c3b8898e64dd0eb05c1883ea1c2fa94e7e606f3ab48bbf5aaee8d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f405b6b517d30a201b793965bd82536f496d62b89562cefc7e3a9d9f7829633\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b6915f908509c8609290327ffc2dccf0e5680dc227979285a7ebaca4643cb7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e837e02e63dbe59e7920302c0fb0b5c9165e96ebb684adadb02bacd61633214\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"le observer\\\\nW0123 10:51:48.273886 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0123 10:51:48.273997 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0123 10:51:48.275269 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3502356273/tls.crt::/tmp/serving-cert-3502356273/tls.key\\\\\\\"\\\\nI0123 10:51:48.537137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0123 10:51:48.548481 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0123 10:51:48.548515 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0123 10:51:48.548544 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0123 10:51:48.548577 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0123 10:51:48.561057 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0123 10:51:48.561112 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0123 10:51:48.561123 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0123 10:51:48.561139 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0123 10:51:48.561151 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0123 10:51:48.561158 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0123 10:51:48.561167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0123 10:51:48.561209 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0123 10:51:48.561756 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da77583099215643577c5d064d67ce2cca9d0b74e7ba7c88f3a948a8516fd66c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd61e9841e88c0a34764491cfafe3e4f94233e4ac0543a3e6ed1326dd8d7280d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd61e9841e88c0a34764491cfafe3e4f94233e4ac0543a3e6ed1326dd8d7280d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:57Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:57 crc kubenswrapper[4957]: I0123 10:51:57.088999 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://512cd439903792d034cd6017d149d8f3e9e24ffbfc36964572fc9419d54c3513\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:57Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:57 crc kubenswrapper[4957]: I0123 10:51:57.101873 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:57Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:57 crc kubenswrapper[4957]: I0123 10:51:57.114538 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2xjv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"224e3211-1f68-4673-8975-7e71b1e513d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dd046581d049e9ca0071a010da143a9b28d271b533b9cdc1c94d19311be0320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s26mf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f355e8990ff693448c7b8df392b7b2caeb59d6fee6cf8d5d4200f8ce1b5e03ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\
":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s26mf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w2xjv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:57Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:57 crc kubenswrapper[4957]: I0123 10:51:57.128501 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tlz2g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"233fdd78-4010-4fe8-9068-ee47d8ff25d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6851e0ec1040550b8c9edb1b85213d2c849e381fae6b0f09c9a7247bd9c5088\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-
var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpwrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tlz2g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:57Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:57 crc kubenswrapper[4957]: I0123 10:51:57.145001 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87adc28a-89e3-4743-a9f2-098d4a9432d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3de3a4f166173fb3f0ff7428b1d1ddf43a9988749fc445ecff4a7527bd1f3a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3de3a4f166173fb3f0ff7428b1d1ddf43a9988749fc445ecff4a7527bd1f3a4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z8hcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:57Z 
is after 2025-08-24T17:21:41Z" Jan 23 10:51:57 crc kubenswrapper[4957]: I0123 10:51:57.157619 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb53662e-fe72-4c19-b3a6-f5b541e5afcb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://829cfdb541d2a7861316957f39b8b9f43ec6f9f4e309a491f4451b1f3c34a9a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d9f270c80ebedc7d79510e2f421e23789483dce954f5e1469469703660febf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90926087d1bb350c991fa9425706fcc22e12eec003aba87b72758892aae9d3a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\
\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8144556693b41dc2f9121be49ceed161caf8db5eec797f086128a2016be8072\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:57Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:57 crc kubenswrapper[4957]: I0123 10:51:57.168615 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rg9hb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a6ddd9-627a-4faa-a4c4-096ea19af31d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f469de9a3c43ade33ae855757f1244dcd825827dea9633af7143c078b08d6d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wngq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rg9hb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:57Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:57 crc kubenswrapper[4957]: I0123 10:51:57.173054 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:51:57 crc kubenswrapper[4957]: I0123 10:51:57.173093 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:51:57 crc kubenswrapper[4957]: I0123 10:51:57.173105 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:51:57 crc kubenswrapper[4957]: I0123 10:51:57.173156 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:51:57 crc kubenswrapper[4957]: I0123 10:51:57.173168 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:51:57Z","lastTransitionTime":"2026-01-23T10:51:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:51:57 crc kubenswrapper[4957]: I0123 10:51:57.177377 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fnxz6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c7b1449-2e9b-4c07-a531-591cb968f511\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6410b47b5b38b4ce50175e8cd9c2cc7ca241b914d1dba4accf3a1deb3e066ccd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj687\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:49Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fnxz6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:57Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:57 crc kubenswrapper[4957]: I0123 10:51:57.212208 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8631604-63ce-40b0-b27e-fba17f940f20\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://523b9a208f414955faffe254957d3bb6d287eab26ea653e23c9bcc2c3182d5cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80645d17b02b24a907b20d376fcb65a794768d4c9cf07550bff63d50a011836d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3fcfc9fcf5f37f32b4a654710f3f0f5c3fab5b0b5c35239e5f1a2789d1ec480\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b48e3593322a778bf8d56e3509d97a341f1fee5
e172f8ba4bbc4c1dacefb3930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://193091ca5d5fb974b1e2da289e7fbc6e2d3a292e79d1936c7ba10266a5ba9779\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7feaf0932d8842db7af2f28984a811c16674b50ba28a4627829ce8471543daa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7feaf0932d8842db7af2f28984a811c16674b50ba28a4627829ce8471543daa9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://370f9a1676bfe308ad9883454af09a802df678db77e0e246de3cc7aea91410a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://370f9a1676bfe308ad9883454af09a802df678db77e0e246de3cc7aea91410a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://78b38484db8444b196494249f32bd63f2478d80e89ff823c8ac7c974b1ea6e76\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78b38484db8444b196494249f32bd63f2478d80e89ff823c8ac7c974b1ea6e76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:57Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:57 crc kubenswrapper[4957]: I0123 10:51:57.222663 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b07c6571fe0e39bd6607feb900919a481ef8a36483c9b4de1c6d5ea3453ba61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:57Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:57 crc kubenswrapper[4957]: I0123 10:51:57.276885 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:51:57 crc kubenswrapper[4957]: I0123 10:51:57.276921 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:51:57 crc kubenswrapper[4957]: I0123 10:51:57.276929 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:51:57 crc kubenswrapper[4957]: I0123 10:51:57.276943 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:51:57 crc kubenswrapper[4957]: I0123 10:51:57.276952 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:51:57Z","lastTransitionTime":"2026-01-23T10:51:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:51:57 crc kubenswrapper[4957]: I0123 10:51:57.380971 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:51:57 crc kubenswrapper[4957]: I0123 10:51:57.381029 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:51:57 crc kubenswrapper[4957]: I0123 10:51:57.381049 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:51:57 crc kubenswrapper[4957]: I0123 10:51:57.381074 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:51:57 crc kubenswrapper[4957]: I0123 10:51:57.381088 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:51:57Z","lastTransitionTime":"2026-01-23T10:51:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:51:57 crc kubenswrapper[4957]: I0123 10:51:57.484134 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:51:57 crc kubenswrapper[4957]: I0123 10:51:57.484167 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:51:57 crc kubenswrapper[4957]: I0123 10:51:57.484179 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:51:57 crc kubenswrapper[4957]: I0123 10:51:57.484197 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:51:57 crc kubenswrapper[4957]: I0123 10:51:57.484210 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:51:57Z","lastTransitionTime":"2026-01-23T10:51:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:51:57 crc kubenswrapper[4957]: I0123 10:51:57.587277 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:51:57 crc kubenswrapper[4957]: I0123 10:51:57.587341 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:51:57 crc kubenswrapper[4957]: I0123 10:51:57.587354 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:51:57 crc kubenswrapper[4957]: I0123 10:51:57.587372 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:51:57 crc kubenswrapper[4957]: I0123 10:51:57.587387 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:51:57Z","lastTransitionTime":"2026-01-23T10:51:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:51:57 crc kubenswrapper[4957]: I0123 10:51:57.690827 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:51:57 crc kubenswrapper[4957]: I0123 10:51:57.690868 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:51:57 crc kubenswrapper[4957]: I0123 10:51:57.690879 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:51:57 crc kubenswrapper[4957]: I0123 10:51:57.690898 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:51:57 crc kubenswrapper[4957]: I0123 10:51:57.690911 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:51:57Z","lastTransitionTime":"2026-01-23T10:51:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:51:57 crc kubenswrapper[4957]: I0123 10:51:57.726535 4957 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-05 11:51:38.642965939 +0000 UTC Jan 23 10:51:57 crc kubenswrapper[4957]: I0123 10:51:57.793018 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:51:57 crc kubenswrapper[4957]: I0123 10:51:57.793053 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:51:57 crc kubenswrapper[4957]: I0123 10:51:57.793062 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:51:57 crc kubenswrapper[4957]: I0123 10:51:57.793078 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:51:57 crc kubenswrapper[4957]: I0123 10:51:57.793091 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:51:57Z","lastTransitionTime":"2026-01-23T10:51:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:51:57 crc kubenswrapper[4957]: I0123 10:51:57.896603 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:51:57 crc kubenswrapper[4957]: I0123 10:51:57.896652 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:51:57 crc kubenswrapper[4957]: I0123 10:51:57.896661 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:51:57 crc kubenswrapper[4957]: I0123 10:51:57.896677 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:51:57 crc kubenswrapper[4957]: I0123 10:51:57.896686 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:51:57Z","lastTransitionTime":"2026-01-23T10:51:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:51:57 crc kubenswrapper[4957]: I0123 10:51:57.980654 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6cq2v" event={"ID":"11d94cd0-1619-4ef6-952a-aef84e1cdc75","Type":"ContainerStarted","Data":"fdb19fbdef461009ebd78d9089ba9c94908e4c9fbcab108320e0d89c7f30547f"} Jan 23 10:51:57 crc kubenswrapper[4957]: I0123 10:51:57.988161 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" event={"ID":"87adc28a-89e3-4743-a9f2-098d4a9432d8","Type":"ContainerStarted","Data":"d87256dede5212cdcd07157c02087d16643332173910f7686c8b8fcaff9ed938"} Jan 23 10:51:57 crc kubenswrapper[4957]: I0123 10:51:57.988840 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" Jan 23 10:51:57 crc kubenswrapper[4957]: I0123 10:51:57.988921 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" Jan 23 10:51:57 crc kubenswrapper[4957]: I0123 10:51:57.999592 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:51:57 crc kubenswrapper[4957]: I0123 10:51:57.999626 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:51:57 crc kubenswrapper[4957]: I0123 10:51:57.999637 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:51:57 crc kubenswrapper[4957]: I0123 10:51:57.999654 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:51:57 crc kubenswrapper[4957]: I0123 10:51:57.999664 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:51:57Z","lastTransitionTime":"2026-01-23T10:51:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:51:58 crc kubenswrapper[4957]: I0123 10:51:58.005800 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb53662e-fe72-4c19-b3a6-f5b541e5afcb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://829cfdb541d2a7861316957f39b8b9f43ec6f9f4e309a491f4451b1f3c34a9a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d9f270c80ebedc7d79510e2f421e23789483dce954f5e1469469703660febf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90926087d1bb350c991fa9425706fcc22e12eec003aba87b72758892aae9d3a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8144556693b41dc2f9121be49ceed161caf8db5eec797f086128a2016be8072\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:58Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:58 crc kubenswrapper[4957]: I0123 10:51:58.018900 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" Jan 23 10:51:58 crc kubenswrapper[4957]: I0123 10:51:58.020405 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" Jan 23 10:51:58 crc kubenswrapper[4957]: I0123 10:51:58.021696 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rg9hb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a6ddd9-627a-4faa-a4c4-096ea19af31d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f469de9a3c43ade33ae855757f1244dcd825827dea9633af7143c078b08d6d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wngq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rg9hb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:58Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:58 crc kubenswrapper[4957]: I0123 10:51:58.036056 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2xjv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"224e3211-1f68-4673-8975-7e71b1e513d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dd046581d049e9ca0071a010da143a9b28d271b533b9cdc1c94d19311be0320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s26mf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f355e8990ff693448c7b8df392b7b2caeb59d6fee6cf8d5d4200f8ce1b5e03ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s26mf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w2xjv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:58Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:58 crc kubenswrapper[4957]: I0123 10:51:58.050695 4957 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-tlz2g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"233fdd78-4010-4fe8-9068-ee47d8ff25d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6851e0ec1040550b8c9edb1b85213d2c849e381fae6b0f09c9a7247bd9c5088\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpwrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-tlz2g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:58Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:58 crc kubenswrapper[4957]: I0123 10:51:58.070944 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87adc28a-89e3-4743-a9f2-098d4a9432d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3de3a4f166173fb3f0ff7428b1d1ddf43a9988749fc445ecff4a7527bd1f3a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3de3a4f166173fb3f0ff7428b1d1ddf43a9988749fc445ecff4a7527bd1f3a4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z8hcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:58Z 
is after 2025-08-24T17:21:41Z" Jan 23 10:51:58 crc kubenswrapper[4957]: I0123 10:51:58.100411 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8631604-63ce-40b0-b27e-fba17f940f20\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://523b9a208f414955faffe254957d3bb6d287eab26ea653e23c9bcc2c3182d5cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80645d17b02b24a907b20d376fcb65a794768d4c9cf07550bff63d50a011836d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3fcfc9fcf5f37f32b4a654710f3f0f5c3fab5b0b5c35239e5f1a2789d1ec480\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"
/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b48e3593322a778bf8d56e3509d97a341f1fee5e172f8ba4bbc4c1dacefb3930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://193091ca5d5fb974b1e2da289e7fbc6e2d3a292e79d1936c7ba10266a5ba9779\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7feaf0932d8842db7af2f28984a811c16674b50ba28a4627829ce8471543daa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7feaf0932d8842db7af2f28984a811c16674b50ba28a4627829ce8471543daa9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://370f9a1676bfe308ad9883454af09a802df678db77e0e246de3cc7aea91410a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://370f9a1676bfe308ad9883454af09a802df678db77e0e246de3cc7aea91410a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\"
:\\\"2026-01-23T10:51:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://78b38484db8444b196494249f32bd63f2478d80e89ff823c8ac7c974b1ea6e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78b38484db8444b196494249f32bd63f2478d80e89ff823c8ac7c974b1ea6e76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:58Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:58 crc kubenswrapper[4957]: I0123 10:51:58.102202 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:51:58 crc kubenswrapper[4957]: I0123 10:51:58.102243 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:51:58 crc kubenswrapper[4957]: I0123 10:51:58.102258 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:51:58 crc kubenswrapper[4957]: I0123 10:51:58.102274 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:51:58 crc kubenswrapper[4957]: I0123 10:51:58.102324 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:51:58Z","lastTransitionTime":"2026-01-23T10:51:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:51:58 crc kubenswrapper[4957]: I0123 10:51:58.119689 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b07c6571fe0e39bd6607feb900919a481ef8a36483c9b4de1c6d5ea3453ba61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:58Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:58 crc kubenswrapper[4957]: I0123 10:51:58.132470 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fnxz6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c7b1449-2e9b-4c07-a531-591cb968f511\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6410b47b5b38b4ce50175e8cd9c2cc7ca241b914d1dba4accf3a1deb3e066ccd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj687\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:49Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fnxz6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:58Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:58 crc kubenswrapper[4957]: I0123 10:51:58.146922 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea507738-b425-4366-808b-3a47317e66d0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cdf71b1a8491d3a4853fde19a5b1af1eb4697cbf07de482e22a52704ba0470f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52eea4c3c7c3b8898e64dd0eb05c1883ea1c2fa94e7e606f3ab48bbf5aaee8d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f405b6b517d30a201b793965bd82536f496d62b89562cefc7e3a9d9f7829633\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b6915f908509c8609290327ffc2dccf0e5680dc227979285a7ebaca4643cb7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e837e02e63dbe59e7920302c0fb0b5c9165e96ebb684adadb02bacd61633214\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"le observer\\\\nW0123 10:51:48.273886 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0123 10:51:48.273997 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0123 10:51:48.275269 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3502356273/tls.crt::/tmp/serving-cert-3502356273/tls.key\\\\\\\"\\\\nI0123 10:51:48.537137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0123 10:51:48.548481 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0123 10:51:48.548515 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0123 10:51:48.548544 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0123 10:51:48.548577 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0123 10:51:48.561057 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0123 10:51:48.561112 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0123 10:51:48.561123 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0123 10:51:48.561139 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0123 10:51:48.561151 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0123 10:51:48.561158 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0123 10:51:48.561167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0123 10:51:48.561209 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0123 10:51:48.561756 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da77583099215643577c5d064d67ce2cca9d0b74e7ba7c88f3a948a8516fd66c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd61e9841e88c0a34764491cfafe3e4f94233e4ac0543a3e6ed1326dd8d7280d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd61e9841e88c0a34764491cfafe3e4f94233e4ac0543a3e6ed1326dd8d7280d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:58Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:58 crc kubenswrapper[4957]: I0123 10:51:58.160918 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:58Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:58 crc kubenswrapper[4957]: I0123 10:51:58.175118 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9f19c40d295c11e3a1170d61fd738b1dcd8fb10087f6a1bb74e6e6c8e6cfb56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6fcda9eaf99f7b60db85da6f18a98ccca7b5bc532aa28388fc7845caf1a7356\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:58Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:58 crc kubenswrapper[4957]: I0123 10:51:58.188651 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:58Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:58 crc kubenswrapper[4957]: I0123 10:51:58.205837 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:51:58 crc kubenswrapper[4957]: I0123 10:51:58.205876 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:51:58 crc kubenswrapper[4957]: I0123 10:51:58.205887 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:51:58 crc kubenswrapper[4957]: I0123 10:51:58.205906 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:51:58 crc kubenswrapper[4957]: I0123 10:51:58.205918 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:51:58Z","lastTransitionTime":"2026-01-23T10:51:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:51:58 crc kubenswrapper[4957]: I0123 10:51:58.206527 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6cq2v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11d94cd0-1619-4ef6-952a-aef84e1cdc75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdb19fbdef461009ebd78d9089ba9c94908e4c9fbcab108320e0d89c7f30547f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d28aff8ae7d4adbef5753dc21af0f1055f19ed4404bf9af31f34be215120555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d28aff8ae7d4adbef5753dc21af0f1055f19ed4404bf9af31f34be215120555\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90e2094361e033ef7c8226913e9e396e6a2036617edb9c5ad0ed00c2c5f9f707\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90e2094361e033ef7c8226913e9e396e6a2036617edb9c5ad0ed00c2c5f9f707\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e945a247fdc5833508edb04c5130a406d137e3b0ab1be12f9bf954f2621c3bf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e945a247fdc5833508edb04c5130a406d137e3b0ab1be12f9bf954f2621c3bf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26ffeff282bb20d68e9f648999f28b818a67b74f062714fd26ad22e70591a74c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26ffeff282bb20d68e9f648999f28b818a67b74f062714fd26ad22e70591a74c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bbe74c2c27b97766cfa5d26987cd0fbd8289871e66eab78c54a52c0aa039728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bbe74c2c27b97766cfa5d26987cd0fbd8289871e66eab78c54a52c0aa039728\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9ee5b55e77324735662dd6bc0fdeee86af454eb4b9e8eb9e877119f7c1395ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9ee5b55e77324735662dd6bc0fdeee86af454eb4b9e8eb9e877119f7c1395ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6cq2v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:58Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:58 crc kubenswrapper[4957]: I0123 10:51:58.223093 4957 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://512cd439903792d034cd6017d149d8f3e9e24ffbfc36964572fc9419d54c3513\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:58Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:58 crc kubenswrapper[4957]: I0123 10:51:58.239321 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:58Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:58 crc kubenswrapper[4957]: I0123 10:51:58.255851 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:58Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:58 crc kubenswrapper[4957]: I0123 10:51:58.272683 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9f19c40d295c11e3a1170d61fd738b1dcd8fb10087f6a1bb74e6e6c8e6cfb56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6fcda9eaf99f7b60db85da6f18a98ccca7b5bc532aa28388fc7845caf1a7356\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:58Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:58 crc kubenswrapper[4957]: I0123 10:51:58.295288 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:58Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:58 crc kubenswrapper[4957]: I0123 10:51:58.308228 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:51:58 crc kubenswrapper[4957]: I0123 10:51:58.308269 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:51:58 crc kubenswrapper[4957]: I0123 10:51:58.308312 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:51:58 crc kubenswrapper[4957]: I0123 10:51:58.308330 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:51:58 crc kubenswrapper[4957]: I0123 10:51:58.308342 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:51:58Z","lastTransitionTime":"2026-01-23T10:51:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:51:58 crc kubenswrapper[4957]: I0123 10:51:58.314740 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6cq2v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11d94cd0-1619-4ef6-952a-aef84e1cdc75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdb19fbdef461009ebd78d9089ba9c94908e4c9fbcab108320e0d89c7f30547f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d28aff8ae7d4adbef5753dc21af0f1055f19ed4404bf9af31f34be215120555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d28aff8ae7d4adbef5753dc21af0f1055f19ed4404bf9af31f34be215120555\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90e2094361e033ef7c8226913e9e396e6a2036617edb9c5ad0ed00c2c5f9f707\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90e2094361e033ef7c8226913e9e396e6a2036617edb9c5ad0ed00c2c5f9f707\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e945a247fdc5833508edb04c5130a406d137e3b0ab1be12f9bf954f2621c3bf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e945a247fdc5833508edb04c5130a406d137e3b0ab1be12f9bf954f2621c3bf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26ffeff282bb20d68e9f648999f28b818a67b74f062714fd26ad22e70591a74c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26ffeff282bb20d68e9f648999f28b818a67b74f062714fd26ad22e70591a74c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bbe74c2c27b97766cfa5d26987cd0fbd8289871e66eab78c54a52c0aa039728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bbe74c2c27b97766cfa5d26987cd0fbd8289871e66eab78c54a52c0aa039728\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9ee5b55e77324735662dd6bc0fdeee86af454eb4b9e8eb9e877119f7c1395ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9ee5b55e77324735662dd6bc0fdeee86af454eb4b9e8eb9e877119f7c1395ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6cq2v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:58Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:58 crc kubenswrapper[4957]: I0123 10:51:58.337133 4957 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea507738-b425-4366-808b-3a47317e66d0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cdf71b1a8491d3a4853fde19a5b1af1eb4697cbf07de482e22a52704ba0470f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52eea4c3c7c3b8898e64dd0eb05c1883ea1c2fa94e7e606f3ab48bbf5aaee8d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f405b6b517d30a201b793965bd82536f496d62b89562cefc7e3a9d9f7829633\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc
/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b6915f908509c8609290327ffc2dccf0e5680dc227979285a7ebaca4643cb7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e837e02e63dbe59e7920302c0fb0b5c9165e96ebb684adadb02bacd61633214\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"le observer\\\\nW0123 10:51:48.273886 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0123 10:51:48.273997 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0123 10:51:48.275269 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3502356273/tls.crt::/tmp/serving-cert-3502356273/tls.key\\\\\\\"\\\\nI0123 10:51:48.537137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0123 10:51:48.548481 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0123 10:51:48.548515 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0123 10:51:48.548544 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0123 10:51:48.548577 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0123 10:51:48.561057 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0123 10:51:48.561112 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0123 10:51:48.561123 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0123 10:51:48.561139 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0123 10:51:48.561151 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0123 10:51:48.561158 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0123 10:51:48.561167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0123 10:51:48.561209 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0123 10:51:48.561756 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da77583099215643577c5d064d67ce2cca9d0b74e7ba7c88f3a948a8516fd66c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd61e9841e88c0a34764491cfafe3e4f94233e4ac0543a3e6ed1326dd8d7280d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd61e9841e88c0a34764491cfafe3e4f94233e4ac0543a3e6ed1326dd8d7280d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:58Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:58 crc kubenswrapper[4957]: I0123 10:51:58.355167 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://512cd439903792d034cd6017d149d8f3e9e24ffbfc36964572fc9419d54c3513\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:58Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:58 crc kubenswrapper[4957]: I0123 10:51:58.375002 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:58Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:58 crc kubenswrapper[4957]: I0123 10:51:58.392369 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tlz2g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"233fdd78-4010-4fe8-9068-ee47d8ff25d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6851e0ec1040550b8c9edb1b85213d2c849e381fae6b0f09c9a7247bd9c5088\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\
\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpwrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tlz2g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:58Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:58 crc kubenswrapper[4957]: I0123 10:51:58.410865 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:51:58 crc kubenswrapper[4957]: I0123 10:51:58.410899 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:51:58 crc kubenswrapper[4957]: I0123 10:51:58.410910 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:51:58 crc kubenswrapper[4957]: I0123 10:51:58.410927 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:51:58 crc kubenswrapper[4957]: I0123 10:51:58.410939 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:51:58Z","lastTransitionTime":"2026-01-23T10:51:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:51:58 crc kubenswrapper[4957]: I0123 10:51:58.422930 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87adc28a-89e3-4743-a9f2-098d4a9432d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a14cf2687aa7c7a4c43dffbc2ad99a41aef0e46719171f63c7f769ee2d54e4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca18567eec1b0cc34d911b28d9f3d670a061722086817f58236f6a0da557262\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://7f862c6f11fe904458a8ecde92079c0b4aa4a9cb4dfc6f2ca094a1d3142570d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0efa75cf10a812bc4de5b071048558eb5f48828f6fb3049f3820fe5e0b7e2b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1a90ac89ce8ac710e5f8cff26e69aff44a735ef8155a7e93324809904a33e01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26b4bdc4f2514902dc2c95df59af4c954a5c1905821f5981e9437ff54d6d544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d87256dede5212cdcd07157c02087d16643332173910f7686c8b8fcaff9ed938\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\
"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1be5b459e3fae28da165ef0ee506ec5ccd39026d7b7e7c35a3f242c65d60d0ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3de3a4f166173fb3f0ff7428b1d1ddf43a9988749fc445ecff4a7527bd1f3a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3de3a4f166173fb3f0ff7428b1d1ddf43a9988749fc445ecff4a7527bd1f3a4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z8hcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:58Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:58 crc kubenswrapper[4957]: I0123 10:51:58.444493 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb53662e-fe72-4c19-b3a6-f5b541e5afcb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://829cfdb541d2a7861316957f39b8b9f43ec6f9f4e309a491f4451b1f3c34a9a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d9f270c80ebedc7d79510e2f421e23789483dce954f5e1469469703660febf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90926087d1bb350c991fa9425706fcc22e12eec003aba87b72758892aae9d3a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8144556693b41dc2f9121be49ceed161caf8db5eec797f086128a2016be8072\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:58Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:58 crc kubenswrapper[4957]: I0123 10:51:58.459607 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rg9hb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a6ddd9-627a-4faa-a4c4-096ea19af31d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f469de9a3c43ade33ae855757f1244dcd825827dea9633af7143c078b08d6d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wngq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase
\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rg9hb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:58Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:58 crc kubenswrapper[4957]: I0123 10:51:58.475196 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2xjv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"224e3211-1f68-4673-8975-7e71b1e513d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dd046581d049e9ca0071a010da143a9b28d271b533b9cdc1c94d19311be0320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s26mf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f355e8990ff693448c7b8df392b7b2caeb59d6fee6cf8d5d4200f8ce1b5e03ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s26mf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"host
IP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w2xjv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:58Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:58 crc kubenswrapper[4957]: I0123 10:51:58.510905 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8631604-63ce-40b0-b27e-fba17f940f20\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://523b9a208f414955faffe254957d3bb6d287eab26ea653e23c9bcc2c3182d5cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80645d17b02b24a907b20d376fcb65a794768d4c9cf07550bff63d50a011836d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://c3fcfc9fcf5f37f32b4a654710f3f0f5c3fab5b0b5c35239e5f1a2789d1ec480\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b48e3593322a778bf8d56e3509d97a341f1fee5e172f8ba4bbc4c1dacefb3930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://193091ca5d5fb974b1e2da289e7fbc6e2d3a292e79d1936c7ba10266a5ba9779\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7feaf0932d8842db7af2f28984a811c16674b50ba28a4627829ce8471543daa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7feaf0932d8842db7af2f28984a811c16674b50ba28a4627829ce8471543daa9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://370f9a1676bfe308ad9883454af09a802d
f678db77e0e246de3cc7aea91410a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://370f9a1676bfe308ad9883454af09a802df678db77e0e246de3cc7aea91410a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://78b38484db8444b196494249f32bd63f2478d80e89ff823c8ac7c974b1ea6e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78b38484db8444b196494249f32bd63f2478d80e89ff823c8ac7c974b1ea6e76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:58Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:58 crc kubenswrapper[4957]: I0123 10:51:58.513348 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:51:58 crc kubenswrapper[4957]: I0123 10:51:58.513412 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:51:58 crc kubenswrapper[4957]: I0123 10:51:58.513429 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:51:58 crc kubenswrapper[4957]: I0123 10:51:58.513456 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:51:58 crc kubenswrapper[4957]: I0123 10:51:58.513474 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:51:58Z","lastTransitionTime":"2026-01-23T10:51:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:51:58 crc kubenswrapper[4957]: I0123 10:51:58.532093 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b07c6571fe0e39bd6607feb900919a481ef8a36483c9b4de1c6d5ea3453ba61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:58Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:58 crc kubenswrapper[4957]: I0123 10:51:58.548164 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fnxz6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c7b1449-2e9b-4c07-a531-591cb968f511\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6410b47b5b38b4ce50175e8cd9c2cc7ca241b914d1dba4accf3a1deb3e066ccd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj687\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:49Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fnxz6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:51:58Z is after 2025-08-24T17:21:41Z" Jan 23 10:51:58 crc kubenswrapper[4957]: I0123 10:51:58.616588 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:51:58 crc kubenswrapper[4957]: I0123 10:51:58.616650 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:51:58 crc kubenswrapper[4957]: I0123 10:51:58.616667 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:51:58 crc kubenswrapper[4957]: I0123 10:51:58.616694 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:51:58 crc kubenswrapper[4957]: I0123 10:51:58.616711 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:51:58Z","lastTransitionTime":"2026-01-23T10:51:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:51:58 crc kubenswrapper[4957]: I0123 10:51:58.720565 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:51:58 crc kubenswrapper[4957]: I0123 10:51:58.720640 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:51:58 crc kubenswrapper[4957]: I0123 10:51:58.720663 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:51:58 crc kubenswrapper[4957]: I0123 10:51:58.720694 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:51:58 crc kubenswrapper[4957]: I0123 10:51:58.720718 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:51:58Z","lastTransitionTime":"2026-01-23T10:51:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:51:58 crc kubenswrapper[4957]: I0123 10:51:58.727342 4957 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 16:39:53.391878482 +0000 UTC Jan 23 10:51:58 crc kubenswrapper[4957]: I0123 10:51:58.769037 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 23 10:51:58 crc kubenswrapper[4957]: I0123 10:51:58.769148 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 23 10:51:58 crc kubenswrapper[4957]: E0123 10:51:58.769326 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 23 10:51:58 crc kubenswrapper[4957]: I0123 10:51:58.769370 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 23 10:51:58 crc kubenswrapper[4957]: E0123 10:51:58.769499 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 23 10:51:58 crc kubenswrapper[4957]: E0123 10:51:58.769761 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 23 10:51:58 crc kubenswrapper[4957]: I0123 10:51:58.823711 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:51:58 crc kubenswrapper[4957]: I0123 10:51:58.823801 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:51:58 crc kubenswrapper[4957]: I0123 10:51:58.823820 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:51:58 crc kubenswrapper[4957]: I0123 10:51:58.823844 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:51:58 crc kubenswrapper[4957]: I0123 10:51:58.823863 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:51:58Z","lastTransitionTime":"2026-01-23T10:51:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:51:58 crc kubenswrapper[4957]: I0123 10:51:58.926859 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:51:58 crc kubenswrapper[4957]: I0123 10:51:58.926930 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:51:58 crc kubenswrapper[4957]: I0123 10:51:58.926952 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:51:58 crc kubenswrapper[4957]: I0123 10:51:58.926982 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:51:58 crc kubenswrapper[4957]: I0123 10:51:58.927004 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:51:58Z","lastTransitionTime":"2026-01-23T10:51:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:51:58 crc kubenswrapper[4957]: I0123 10:51:58.991772 4957 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 23 10:51:59 crc kubenswrapper[4957]: I0123 10:51:59.029324 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:51:59 crc kubenswrapper[4957]: I0123 10:51:59.029352 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:51:59 crc kubenswrapper[4957]: I0123 10:51:59.029359 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:51:59 crc kubenswrapper[4957]: I0123 10:51:59.029372 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:51:59 crc kubenswrapper[4957]: I0123 10:51:59.029380 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:51:59Z","lastTransitionTime":"2026-01-23T10:51:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:51:59 crc kubenswrapper[4957]: I0123 10:51:59.131636 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:51:59 crc kubenswrapper[4957]: I0123 10:51:59.131679 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:51:59 crc kubenswrapper[4957]: I0123 10:51:59.131690 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:51:59 crc kubenswrapper[4957]: I0123 10:51:59.131704 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:51:59 crc kubenswrapper[4957]: I0123 10:51:59.131714 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:51:59Z","lastTransitionTime":"2026-01-23T10:51:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:51:59 crc kubenswrapper[4957]: I0123 10:51:59.234392 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:51:59 crc kubenswrapper[4957]: I0123 10:51:59.234429 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:51:59 crc kubenswrapper[4957]: I0123 10:51:59.234458 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:51:59 crc kubenswrapper[4957]: I0123 10:51:59.234472 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:51:59 crc kubenswrapper[4957]: I0123 10:51:59.234481 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:51:59Z","lastTransitionTime":"2026-01-23T10:51:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:51:59 crc kubenswrapper[4957]: I0123 10:51:59.336732 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:51:59 crc kubenswrapper[4957]: I0123 10:51:59.336770 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:51:59 crc kubenswrapper[4957]: I0123 10:51:59.336779 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:51:59 crc kubenswrapper[4957]: I0123 10:51:59.336795 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:51:59 crc kubenswrapper[4957]: I0123 10:51:59.336805 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:51:59Z","lastTransitionTime":"2026-01-23T10:51:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:51:59 crc kubenswrapper[4957]: I0123 10:51:59.439909 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:51:59 crc kubenswrapper[4957]: I0123 10:51:59.439956 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:51:59 crc kubenswrapper[4957]: I0123 10:51:59.439970 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:51:59 crc kubenswrapper[4957]: I0123 10:51:59.439990 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:51:59 crc kubenswrapper[4957]: I0123 10:51:59.440003 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:51:59Z","lastTransitionTime":"2026-01-23T10:51:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:51:59 crc kubenswrapper[4957]: I0123 10:51:59.544225 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:51:59 crc kubenswrapper[4957]: I0123 10:51:59.544283 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:51:59 crc kubenswrapper[4957]: I0123 10:51:59.544326 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:51:59 crc kubenswrapper[4957]: I0123 10:51:59.544351 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:51:59 crc kubenswrapper[4957]: I0123 10:51:59.544368 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:51:59Z","lastTransitionTime":"2026-01-23T10:51:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:51:59 crc kubenswrapper[4957]: I0123 10:51:59.647654 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:51:59 crc kubenswrapper[4957]: I0123 10:51:59.647689 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:51:59 crc kubenswrapper[4957]: I0123 10:51:59.647701 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:51:59 crc kubenswrapper[4957]: I0123 10:51:59.647717 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:51:59 crc kubenswrapper[4957]: I0123 10:51:59.647728 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:51:59Z","lastTransitionTime":"2026-01-23T10:51:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:51:59 crc kubenswrapper[4957]: I0123 10:51:59.728016 4957 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 23:42:48.131809152 +0000 UTC Jan 23 10:51:59 crc kubenswrapper[4957]: I0123 10:51:59.750790 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:51:59 crc kubenswrapper[4957]: I0123 10:51:59.750837 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:51:59 crc kubenswrapper[4957]: I0123 10:51:59.750853 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:51:59 crc kubenswrapper[4957]: I0123 10:51:59.750874 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:51:59 crc kubenswrapper[4957]: I0123 10:51:59.750889 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:51:59Z","lastTransitionTime":"2026-01-23T10:51:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:51:59 crc kubenswrapper[4957]: I0123 10:51:59.853870 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:51:59 crc kubenswrapper[4957]: I0123 10:51:59.853928 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:51:59 crc kubenswrapper[4957]: I0123 10:51:59.853946 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:51:59 crc kubenswrapper[4957]: I0123 10:51:59.853970 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:51:59 crc kubenswrapper[4957]: I0123 10:51:59.853987 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:51:59Z","lastTransitionTime":"2026-01-23T10:51:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:51:59 crc kubenswrapper[4957]: I0123 10:51:59.957569 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:51:59 crc kubenswrapper[4957]: I0123 10:51:59.957637 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:51:59 crc kubenswrapper[4957]: I0123 10:51:59.957656 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:51:59 crc kubenswrapper[4957]: I0123 10:51:59.957684 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:51:59 crc kubenswrapper[4957]: I0123 10:51:59.957703 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:51:59Z","lastTransitionTime":"2026-01-23T10:51:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:51:59 crc kubenswrapper[4957]: I0123 10:51:59.995954 4957 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 23 10:52:00 crc kubenswrapper[4957]: I0123 10:52:00.060111 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:00 crc kubenswrapper[4957]: I0123 10:52:00.060172 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:00 crc kubenswrapper[4957]: I0123 10:52:00.060189 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:00 crc kubenswrapper[4957]: I0123 10:52:00.060213 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:00 crc kubenswrapper[4957]: I0123 10:52:00.060230 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:00Z","lastTransitionTime":"2026-01-23T10:52:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:00 crc kubenswrapper[4957]: I0123 10:52:00.163812 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:00 crc kubenswrapper[4957]: I0123 10:52:00.163883 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:00 crc kubenswrapper[4957]: I0123 10:52:00.163907 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:00 crc kubenswrapper[4957]: I0123 10:52:00.163938 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:00 crc kubenswrapper[4957]: I0123 10:52:00.163961 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:00Z","lastTransitionTime":"2026-01-23T10:52:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:00 crc kubenswrapper[4957]: I0123 10:52:00.266736 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:00 crc kubenswrapper[4957]: I0123 10:52:00.266799 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:00 crc kubenswrapper[4957]: I0123 10:52:00.266817 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:00 crc kubenswrapper[4957]: I0123 10:52:00.266842 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:00 crc kubenswrapper[4957]: I0123 10:52:00.266859 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:00Z","lastTransitionTime":"2026-01-23T10:52:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:00 crc kubenswrapper[4957]: I0123 10:52:00.370210 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:00 crc kubenswrapper[4957]: I0123 10:52:00.370271 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:00 crc kubenswrapper[4957]: I0123 10:52:00.370326 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:00 crc kubenswrapper[4957]: I0123 10:52:00.370366 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:00 crc kubenswrapper[4957]: I0123 10:52:00.370403 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:00Z","lastTransitionTime":"2026-01-23T10:52:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:00 crc kubenswrapper[4957]: I0123 10:52:00.473351 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:00 crc kubenswrapper[4957]: I0123 10:52:00.473407 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:00 crc kubenswrapper[4957]: I0123 10:52:00.473418 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:00 crc kubenswrapper[4957]: I0123 10:52:00.473445 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:00 crc kubenswrapper[4957]: I0123 10:52:00.473466 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:00Z","lastTransitionTime":"2026-01-23T10:52:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:00 crc kubenswrapper[4957]: I0123 10:52:00.576563 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:00 crc kubenswrapper[4957]: I0123 10:52:00.576618 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:00 crc kubenswrapper[4957]: I0123 10:52:00.576629 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:00 crc kubenswrapper[4957]: I0123 10:52:00.576675 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:00 crc kubenswrapper[4957]: I0123 10:52:00.576688 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:00Z","lastTransitionTime":"2026-01-23T10:52:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:00 crc kubenswrapper[4957]: I0123 10:52:00.679870 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:00 crc kubenswrapper[4957]: I0123 10:52:00.680011 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:00 crc kubenswrapper[4957]: I0123 10:52:00.680038 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:00 crc kubenswrapper[4957]: I0123 10:52:00.680070 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:00 crc kubenswrapper[4957]: I0123 10:52:00.680100 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:00Z","lastTransitionTime":"2026-01-23T10:52:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:00 crc kubenswrapper[4957]: I0123 10:52:00.729505 4957 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-09 14:07:43.986565842 +0000 UTC Jan 23 10:52:00 crc kubenswrapper[4957]: I0123 10:52:00.769355 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 23 10:52:00 crc kubenswrapper[4957]: E0123 10:52:00.769842 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 23 10:52:00 crc kubenswrapper[4957]: I0123 10:52:00.769883 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 23 10:52:00 crc kubenswrapper[4957]: I0123 10:52:00.769947 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 23 10:52:00 crc kubenswrapper[4957]: E0123 10:52:00.770032 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 23 10:52:00 crc kubenswrapper[4957]: E0123 10:52:00.770081 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 23 10:52:00 crc kubenswrapper[4957]: I0123 10:52:00.783014 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:00 crc kubenswrapper[4957]: I0123 10:52:00.783057 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:00 crc kubenswrapper[4957]: I0123 10:52:00.783070 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:00 crc kubenswrapper[4957]: I0123 10:52:00.783087 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:00 crc kubenswrapper[4957]: I0123 10:52:00.783099 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:00Z","lastTransitionTime":"2026-01-23T10:52:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:00 crc kubenswrapper[4957]: I0123 10:52:00.795566 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8631604-63ce-40b0-b27e-fba17f940f20\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://523b9a208f414955faffe254957d3bb6d287eab26ea653e23c9bcc2c3182d5cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80645d17b02b24a907b20d376fcb65a794768d4c9cf07550bff63d50a011836d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3fcfc9fcf5f37f32b4a654710f3f0f5c3fab5b0b5c35239e5f1a2789d1ec480\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b48e3593322a778bf8d56e3509d97a341f1fee5e172f8ba4bbc4c1dacefb3930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://193091ca5d5fb974b1e2da289e7fbc6e2d3a292e79d1936c7ba10266a5ba9779\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7feaf0932d8842db7af2f28984a811c16674b50ba28a4627829ce8471543daa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7feaf0932d8842db7af2f28984a811c16674b50ba28a4627829ce8471543daa9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://370f9a1676bfe308ad9883454af09a802df678db77e0e246de3cc7aea91410a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://370f9a1676bfe308ad9883454af09a802df678db77e0e246de3cc7aea91410a6\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://78b38484db8444b196494249f32bd63f2478d80e89ff823c8ac7c974b1ea6e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78b38484db8444b196494249f32bd63f2478d80e89ff823c8ac7c974b1ea6e76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:00Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:00 crc kubenswrapper[4957]: I0123 10:52:00.808334 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b07c6571fe0e39bd6607feb900919a481ef8a36483c9b4de1c6d5ea3453ba61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:00Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:00 crc kubenswrapper[4957]: I0123 10:52:00.818598 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fnxz6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c7b1449-2e9b-4c07-a531-591cb968f511\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6410b47b5b38b4ce50175e8cd9c2cc7ca241b914d1dba4accf3a1deb3e066ccd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj687\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:49Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fnxz6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:00Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:00 crc kubenswrapper[4957]: I0123 10:52:00.833365 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:00Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:00 crc kubenswrapper[4957]: I0123 10:52:00.849013 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6cq2v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11d94cd0-1619-4ef6-952a-aef84e1cdc75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdb19fbdef461009ebd78d9089ba9c94908e4c9fbcab108320e0d89c7f30547f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d28aff8ae7d4adbef5753dc21af0f1055f19ed4404bf9af31f34be215120555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d28aff8ae7d4adbef5753dc21af0f1055f19ed4404bf9af31f34be215120555\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90e2094361e033ef7c8226913e9e396e6a2036617edb9c5ad0ed00c2c5f9f707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90e2094361e033ef7c8226913e9e396e6a2036617edb9c5ad0ed00c2c5f9f707\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e945a247fdc5833508edb04c5130a406d137e3b0ab1be12f9bf954f2621c3bf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e945a247fdc5833508edb04c5130a406d137e3b0ab1be12f9bf954f2621c3bf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26ffeff282bb20d68e9f648999f28b818a67b74f062714fd26ad22e70591a74c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26ffeff282bb20d68e9f648999f28b818a67b74f062714fd26ad22e70591a74c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bbe74c2c27b97766cfa5d26987cd0fbd8289871e66eab78c54a52c0aa039728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bbe74c2c27b97766cfa5d26987cd0fbd8289871e66eab78c54a52c0aa039728\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9ee5b55e77324735662dd6bc0fdeee86af454eb4b9e8eb9e877119f7c1395ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9ee5b55e77324735662dd6bc0fdeee86af454eb4b9e8eb9e877119f7c1395ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6cq2v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:00Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:00 crc kubenswrapper[4957]: I0123 10:52:00.870606 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea507738-b425-4366-808b-3a47317e66d0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cdf71b1a8491d3a4853fde19a5b1af1eb4697cbf07de482e22a52704ba0470f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52eea4c3c7c3b8898e64dd0eb05c1883ea1c2fa94e7e606f3ab48bbf5aaee8d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f405b6b517d30a201b793965bd82536f496d62b89562cefc7e3a9d9f7829633\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b6915f908509c8609290327ffc2dccf0e5680dc227979285a7ebaca4643cb7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e837e02e63dbe59e7920302c0fb0b5c9165e96ebb684adadb02bacd61633214\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"le observer\\\\nW0123 10:51:48.273886 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0123 10:51:48.273997 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0123 10:51:48.275269 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3502356273/tls.crt::/tmp/serving-cert-3502356273/tls.key\\\\\\\"\\\\nI0123 10:51:48.537137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0123 10:51:48.548481 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0123 10:51:48.548515 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0123 10:51:48.548544 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0123 10:51:48.548577 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0123 10:51:48.561057 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0123 10:51:48.561112 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0123 10:51:48.561123 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0123 10:51:48.561139 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0123 10:51:48.561151 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0123 10:51:48.561158 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0123 10:51:48.561167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0123 10:51:48.561209 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0123 10:51:48.561756 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da77583099215643577c5d064d67ce2cca9d0b74e7ba7c88f3a948a8516fd66c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd61e9841e88c0a34764491cfafe3e4f94233e4ac0543a3e6ed1326dd8d7280d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd61e9841e88c0a34764491cfafe3e4f94233e4ac0543a3e6ed1326dd8d7280d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:00Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:00 crc kubenswrapper[4957]: I0123 10:52:00.884928 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:00Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:00 crc kubenswrapper[4957]: I0123 10:52:00.885394 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:00 crc kubenswrapper[4957]: I0123 10:52:00.885442 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:00 crc kubenswrapper[4957]: I0123 10:52:00.885460 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:00 crc kubenswrapper[4957]: I0123 10:52:00.885486 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:00 crc kubenswrapper[4957]: I0123 10:52:00.885501 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:00Z","lastTransitionTime":"2026-01-23T10:52:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:00 crc kubenswrapper[4957]: I0123 10:52:00.899547 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9f19c40d295c11e3a1170d61fd738b1dcd8fb10087f6a1bb74e6e6c8e6cfb56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6fcda9eaf99f7b60db85da6f18a98ccca7b5bc532aa28388fc7845caf1a7356\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:00Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:00 crc kubenswrapper[4957]: I0123 10:52:00.916813 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://512cd439903792d034cd6017d149d8f3e9e24ffbfc36964572fc9419d54c3513\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:00Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:00 crc kubenswrapper[4957]: I0123 10:52:00.931213 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:00Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:00 crc kubenswrapper[4957]: I0123 10:52:00.947192 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb53662e-fe72-4c19-b3a6-f5b541e5afcb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://829cfdb541d2a7861316957f39b8b9f43ec6f9f4e309a491f4451b1f3c34a9a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d9f270c80ebedc7d79510e2f421e23789483dce954f5e1469469703660febf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\
\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90926087d1bb350c991fa9425706fcc22e12eec003aba87b72758892aae9d3a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8144556693b41dc2f9121be49ceed161caf8db5eec797f086128a2016be8072\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:00Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:00 crc kubenswrapper[4957]: I0123 10:52:00.957704 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rg9hb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a6ddd9-627a-4faa-a4c4-096ea19af31d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f469de9a3c43ade33ae855757f1244dcd825827dea9633af7143c078b08d6d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wngq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rg9hb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:00Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:00 crc kubenswrapper[4957]: I0123 10:52:00.968570 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2xjv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"224e3211-1f68-4673-8975-7e71b1e513d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dd046581d049e9ca0071a010da143a9b28d271b533b9cdc1c94d19311be0320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s26mf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f355e8990ff693448c7b8df392b7b2caeb59d6fee6cf8d5d4200f8ce1b5e03ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s26mf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w2xjv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:00Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:00 crc kubenswrapper[4957]: I0123 10:52:00.982143 4957 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-tlz2g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"233fdd78-4010-4fe8-9068-ee47d8ff25d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6851e0ec1040550b8c9edb1b85213d2c849e381fae6b0f09c9a7247bd9c5088\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpwrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-tlz2g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:00Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:00 crc kubenswrapper[4957]: I0123 10:52:00.987002 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:00 crc kubenswrapper[4957]: I0123 10:52:00.987056 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:00 crc kubenswrapper[4957]: I0123 10:52:00.987068 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:00 crc kubenswrapper[4957]: I0123 10:52:00.987083 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:00 crc kubenswrapper[4957]: I0123 10:52:00.987094 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:00Z","lastTransitionTime":"2026-01-23T10:52:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:00 crc kubenswrapper[4957]: I0123 10:52:00.999595 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z8hcw_87adc28a-89e3-4743-a9f2-098d4a9432d8/ovnkube-controller/0.log" Jan 23 10:52:01 crc kubenswrapper[4957]: I0123 10:52:01.002184 4957 generic.go:334] "Generic (PLEG): container finished" podID="87adc28a-89e3-4743-a9f2-098d4a9432d8" containerID="d87256dede5212cdcd07157c02087d16643332173910f7686c8b8fcaff9ed938" exitCode=1 Jan 23 10:52:01 crc kubenswrapper[4957]: I0123 10:52:01.002219 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" event={"ID":"87adc28a-89e3-4743-a9f2-098d4a9432d8","Type":"ContainerDied","Data":"d87256dede5212cdcd07157c02087d16643332173910f7686c8b8fcaff9ed938"} Jan 23 10:52:01 crc kubenswrapper[4957]: I0123 10:52:01.002867 4957 scope.go:117] "RemoveContainer" containerID="d87256dede5212cdcd07157c02087d16643332173910f7686c8b8fcaff9ed938" Jan 23 10:52:01 crc kubenswrapper[4957]: I0123 10:52:01.009691 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87adc28a-89e3-4743-a9f2-098d4a9432d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a14cf2687aa7c7a4c43dffbc2ad99a41aef0e46719171f63c7f769ee2d54e4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca18567eec1b0cc34d911b28d9f3d670a061722086817f58236f6a0da557262\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f862c6f11fe904458a8ecde92079c0b4aa4a9cb4dfc6f2ca094a1d3142570d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0efa75cf10a812bc4de5b071048558eb5f48828f6fb3049f3820fe5e0b7e2b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1a90ac89ce8ac710e5f8cff26e69aff44a735ef8155a7e93324809904a33e01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26b4bdc4f2514902dc2c95df59af4c954a5c1905821f5981e9437ff54d6d544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d87256dede5212cdcd07157c02087d16643332173910f7686c8b8fcaff9ed938\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1be5b459e3fae28da165ef0ee506ec5ccd39026d7b7e7c35a3f242c65d60d0ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPat
h\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3de3a4f166173fb3f0ff7428b1d1ddf43a9988749fc445ecff4a7527bd1f3a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3de3a4f166173fb3f0ff7428b1d1ddf43a9988749fc445ecff4a7527bd1f3a4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z8hcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:01Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:01 crc kubenswrapper[4957]: I0123 10:52:01.029483 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea507738-b425-4366-808b-3a47317e66d0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cdf71b1a8491d3a4853fde19a5b1af1eb4697cbf07de482e22a52704ba0470f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52eea4c3c7c3b8898e64dd0eb05c1883ea1c2fa94e7e606f3ab48bbf5aaee8d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f405b6b517d30a201b793965bd82536f496d62b89562cefc7e3a9d9f7829633\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b6915f908509c8609290327ffc2dccf0e5680dc227979285a7ebaca4643cb7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e837e02e63dbe59e7920302c0fb0b5c9165e96ebb684adadb02bacd61633214\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"le observer\\\\nW0123 10:51:48.273886 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0123 10:51:48.273997 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0123 10:51:48.275269 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3502356273/tls.crt::/tmp/serving-cert-3502356273/tls.key\\\\\\\"\\\\nI0123 10:51:48.537137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0123 10:51:48.548481 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0123 10:51:48.548515 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0123 10:51:48.548544 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0123 10:51:48.548577 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0123 10:51:48.561057 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0123 10:51:48.561112 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0123 10:51:48.561123 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0123 10:51:48.561139 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0123 10:51:48.561151 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0123 10:51:48.561158 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0123 10:51:48.561167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0123 10:51:48.561209 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0123 10:51:48.561756 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da77583099215643577c5d064d67ce2cca9d0b74e7ba7c88f3a948a8516fd66c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd61e9841e88c0a34764491cfafe3e4f94233e4ac0543a3e6ed1326dd8d7280d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd61e9841e88c0a34764491cfafe3e4f94233e4ac0543a3e6ed1326dd8d7280d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:01Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:01 crc kubenswrapper[4957]: I0123 10:52:01.044368 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:01Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:01 crc kubenswrapper[4957]: I0123 10:52:01.055132 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9f19c40d295c11e3a1170d61fd738b1dcd8fb10087f6a1bb74e6e6c8e6cfb56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6fcda9eaf99f7b60db85da6f18a98ccca7b5bc532aa28388fc7845caf1a7356\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:01Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:01 crc kubenswrapper[4957]: I0123 10:52:01.067661 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:01Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:01 crc kubenswrapper[4957]: I0123 10:52:01.079409 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6cq2v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11d94cd0-1619-4ef6-952a-aef84e1cdc75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdb19fbdef461009ebd78d9089ba9c94908e4c9fbcab108320e0d89c7f30547f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d28aff8ae7d4adbef5753dc21af0f1055f19ed4404bf9af31f34be215120555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d28aff8ae7d4adbef5753dc21af0f1055f19ed4404bf9af31f34be215120555\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90e2094361e033ef7c8226913e9e396e6a2036617edb9c5ad0ed00c2c5f9f707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90e2094361e033ef7c8226913e9e396e6a2036617edb9c5ad0ed00c2c5f9f707\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e945a247fdc5833508edb04c5130a406d137e3b0ab1be12f9bf954f2621c3bf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e945a247fdc5833508edb04c5130a406d137e3b0ab1be12f9bf954f2621c3bf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26ffeff282bb20d68e9f648999f28b818a67b74f062714fd26ad22e70591a74c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26ffeff282bb20d68e9f648999f28b818a67b74f062714fd26ad22e70591a74c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bbe74c2c27b97766cfa5d26987cd0fbd8289871e66eab78c54a52c0aa039728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bbe74c2c27b97766cfa5d26987cd0fbd8289871e66eab78c54a52c0aa039728\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9ee5b55e77324735662dd6bc0fdeee86af454eb4b9e8eb9e877119f7c1395ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9ee5b55e77324735662dd6bc0fdeee86af454eb4b9e8eb9e877119f7c1395ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6cq2v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:01Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:01 crc kubenswrapper[4957]: I0123 10:52:01.089371 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:01 crc kubenswrapper[4957]: I0123 10:52:01.089400 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:01 crc 
kubenswrapper[4957]: I0123 10:52:01.089409 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:01 crc kubenswrapper[4957]: I0123 10:52:01.089422 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:01 crc kubenswrapper[4957]: I0123 10:52:01.089430 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:01Z","lastTransitionTime":"2026-01-23T10:52:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:01 crc kubenswrapper[4957]: I0123 10:52:01.090101 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://512cd439903792d034cd6017d149d8f3e9e24ffbfc36964572fc9419d54c3513\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:01Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:01 crc kubenswrapper[4957]: I0123 10:52:01.102875 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:01Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:01 crc kubenswrapper[4957]: I0123 10:52:01.115072 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb53662e-fe72-4c19-b3a6-f5b541e5afcb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://829cfdb541d2a7861316957f39b8b9f43ec6f9f4e309a491f4451b1f3c34a9a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d9f270c80ebedc7d79510e2f421e23789483dce954f5e1469469703660febf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90926087d1bb350c991fa9425706fcc22e12eec003aba87b72758892aae9d3a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8144556693b41dc2f9121be49ceed161caf8db5eec797f086128a2016be8072\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:01Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:01 crc kubenswrapper[4957]: I0123 10:52:01.125665 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rg9hb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a6ddd9-627a-4faa-a4c4-096ea19af31d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f469de9a3c43ade33ae855757f1244dcd825827dea9633af7143c078b08d6d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wngq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase
\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rg9hb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:01Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:01 crc kubenswrapper[4957]: I0123 10:52:01.140415 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2xjv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"224e3211-1f68-4673-8975-7e71b1e513d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dd046581d049e9ca0071a010da143a9b28d271b533b9cdc1c94d19311be0320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s26mf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f355e8990ff693448c7b8df392b7b2caeb59d6fee6cf8d5d4200f8ce1b5e03ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s26mf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"host
IP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w2xjv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:01Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:01 crc kubenswrapper[4957]: I0123 10:52:01.154130 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tlz2g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"233fdd78-4010-4fe8-9068-ee47d8ff25d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6851e0ec1040550b8c9edb1b85213d2c849e381fae6b0f09c9a7247bd9c5088\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"
name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpwrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tlz2g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:01Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:01 crc kubenswrapper[4957]: I0123 10:52:01.182699 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87adc28a-89e3-4743-a9f2-098d4a9432d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a14cf2687aa7c7a4c43dffbc2ad99a41aef0e46719171f63c7f769ee2d54e4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca18567eec1b0cc34d911b28d9f3d670a061722086817f58236f6a0da557262\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f862c6f11fe904458a8ecde92079c0b4aa4a9cb4dfc6f2ca094a1d3142570d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0efa75cf10a812bc4de5b071048558eb5f48828f6fb3049f3820fe5e0b7e2b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1a90ac89ce8ac710e5f8cff26e69aff44a735ef8155a7e93324809904a33e01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26b4bdc4f2514902dc2c95df59af4c954a5c1905821f5981e9437ff54d6d544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d87256dede5212cdcd07157c02087d1664333217
3910f7686c8b8fcaff9ed938\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d87256dede5212cdcd07157c02087d16643332173910f7686c8b8fcaff9ed938\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-23T10:52:00Z\\\",\\\"message\\\":\\\"ssqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0123 10:52:00.255642 6262 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0123 10:52:00.256029 6262 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0123 10:52:00.256067 6262 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0123 10:52:00.256090 6262 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0123 10:52:00.256112 6262 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0123 10:52:00.256120 6262 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0123 10:52:00.256141 6262 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0123 10:52:00.256177 6262 factory.go:656] Stopping watch factory\\\\nI0123 10:52:00.256195 6262 ovnkube.go:599] Stopped ovnkube\\\\nI0123 10:52:00.256233 6262 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0123 10:52:00.256256 6262 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0123 10:52:00.256271 6262 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0123 10:52:00.256300 6262 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0123 10:52:00.256331 6262 handler.go:208] Removed *v1.Pod event handler 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1be5b459e3fae28da165ef0ee506ec5ccd39026d7b7e7c35a3f242c65d60d0ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3de3a4f166173fb3f0ff7428b1d1ddf43a9988749fc445ecff4a7527bd1f3a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099
482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3de3a4f166173fb3f0ff7428b1d1ddf43a9988749fc445ecff4a7527bd1f3a4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z8hcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:01Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:01 crc kubenswrapper[4957]: I0123 10:52:01.191575 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:01 crc kubenswrapper[4957]: I0123 10:52:01.191666 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:01 crc kubenswrapper[4957]: I0123 10:52:01.191688 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:01 crc kubenswrapper[4957]: I0123 10:52:01.191717 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:01 crc kubenswrapper[4957]: I0123 10:52:01.191739 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:01Z","lastTransitionTime":"2026-01-23T10:52:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:01 crc kubenswrapper[4957]: I0123 10:52:01.237090 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8631604-63ce-40b0-b27e-fba17f940f20\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://523b9a208f414955faffe254957d3bb6d287eab26ea653e23c9bcc2c3182d5cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80645d17b02b24a907b20d376fcb65a794768d4c9cf07550bff63d50a011836d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3fcfc9fcf5f37f32b4a654710f3f0f5c3fab5b0b5c35239e5f1a2789d1ec480\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b48e3593322a778bf8d56e3509d97a341f1fee5e172f8ba4bbc4c1dacefb3930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://193091ca5d5fb974b1e2da289e7fbc6e2d3a292e79d1936c7ba10266a5ba9779\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7feaf0932d8842db7af2f28984a811c16674b50ba28a4627829ce8471543daa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7feaf0932d8842db7af2f28984a811c16674b50ba28a4627829ce8471543daa9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://370f9a1676bfe308ad9883454af09a802df678db77e0e246de3cc7aea91410a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://370f9a1676bfe308ad9883454af09a802df678db77e0e246de3cc7aea91410a6\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://78b38484db8444b196494249f32bd63f2478d80e89ff823c8ac7c974b1ea6e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78b38484db8444b196494249f32bd63f2478d80e89ff823c8ac7c974b1ea6e76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:01Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:01 crc kubenswrapper[4957]: I0123 10:52:01.255502 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b07c6571fe0e39bd6607feb900919a481ef8a36483c9b4de1c6d5ea3453ba61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:01Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:01 crc kubenswrapper[4957]: I0123 10:52:01.266760 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fnxz6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c7b1449-2e9b-4c07-a531-591cb968f511\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6410b47b5b38b4ce50175e8cd9c2cc7ca241b914d1dba4accf3a1deb3e066ccd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj687\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:49Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fnxz6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:01Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:01 crc kubenswrapper[4957]: I0123 10:52:01.294212 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:01 crc kubenswrapper[4957]: I0123 10:52:01.294236 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:01 crc kubenswrapper[4957]: I0123 10:52:01.294246 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:01 crc kubenswrapper[4957]: I0123 10:52:01.294259 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:01 crc kubenswrapper[4957]: I0123 10:52:01.294267 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:01Z","lastTransitionTime":"2026-01-23T10:52:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:01 crc kubenswrapper[4957]: I0123 10:52:01.396781 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:01 crc kubenswrapper[4957]: I0123 10:52:01.396825 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:01 crc kubenswrapper[4957]: I0123 10:52:01.396841 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:01 crc kubenswrapper[4957]: I0123 10:52:01.396862 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:01 crc kubenswrapper[4957]: I0123 10:52:01.396877 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:01Z","lastTransitionTime":"2026-01-23T10:52:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:01 crc kubenswrapper[4957]: I0123 10:52:01.499353 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:01 crc kubenswrapper[4957]: I0123 10:52:01.499388 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:01 crc kubenswrapper[4957]: I0123 10:52:01.499399 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:01 crc kubenswrapper[4957]: I0123 10:52:01.499414 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:01 crc kubenswrapper[4957]: I0123 10:52:01.499423 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:01Z","lastTransitionTime":"2026-01-23T10:52:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:01 crc kubenswrapper[4957]: I0123 10:52:01.601042 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:01 crc kubenswrapper[4957]: I0123 10:52:01.601085 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:01 crc kubenswrapper[4957]: I0123 10:52:01.601093 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:01 crc kubenswrapper[4957]: I0123 10:52:01.601113 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:01 crc kubenswrapper[4957]: I0123 10:52:01.601127 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:01Z","lastTransitionTime":"2026-01-23T10:52:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:01 crc kubenswrapper[4957]: I0123 10:52:01.702691 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:01 crc kubenswrapper[4957]: I0123 10:52:01.702740 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:01 crc kubenswrapper[4957]: I0123 10:52:01.702752 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:01 crc kubenswrapper[4957]: I0123 10:52:01.702771 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:01 crc kubenswrapper[4957]: I0123 10:52:01.702783 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:01Z","lastTransitionTime":"2026-01-23T10:52:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:01 crc kubenswrapper[4957]: I0123 10:52:01.730493 4957 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-08 09:31:15.338751579 +0000 UTC Jan 23 10:52:01 crc kubenswrapper[4957]: I0123 10:52:01.805099 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:01 crc kubenswrapper[4957]: I0123 10:52:01.805149 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:01 crc kubenswrapper[4957]: I0123 10:52:01.805161 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:01 crc kubenswrapper[4957]: I0123 10:52:01.805180 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:01 crc kubenswrapper[4957]: I0123 10:52:01.805191 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:01Z","lastTransitionTime":"2026-01-23T10:52:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:01 crc kubenswrapper[4957]: I0123 10:52:01.908596 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:01 crc kubenswrapper[4957]: I0123 10:52:01.908652 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:01 crc kubenswrapper[4957]: I0123 10:52:01.908670 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:01 crc kubenswrapper[4957]: I0123 10:52:01.908691 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:01 crc kubenswrapper[4957]: I0123 10:52:01.908707 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:01Z","lastTransitionTime":"2026-01-23T10:52:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:01 crc kubenswrapper[4957]: I0123 10:52:01.987845 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:01 crc kubenswrapper[4957]: I0123 10:52:01.987881 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:01 crc kubenswrapper[4957]: I0123 10:52:01.987890 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:01 crc kubenswrapper[4957]: I0123 10:52:01.987906 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:01 crc kubenswrapper[4957]: I0123 10:52:01.987915 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:01Z","lastTransitionTime":"2026-01-23T10:52:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:02 crc kubenswrapper[4957]: E0123 10:52:02.000051 4957 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"608e000a-3057-4f1e-b4ab-15bf3bfea3b8\\\",\\\"systemUUID\\\":\\\"4219e85c-09d5-42d3-a5cb-7c9fe3da136f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:01Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:02 crc kubenswrapper[4957]: I0123 10:52:02.003659 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:02 crc kubenswrapper[4957]: I0123 10:52:02.003687 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 23 10:52:02 crc kubenswrapper[4957]: I0123 10:52:02.003695 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:02 crc kubenswrapper[4957]: I0123 10:52:02.003708 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:02 crc kubenswrapper[4957]: I0123 10:52:02.003718 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:02Z","lastTransitionTime":"2026-01-23T10:52:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:02 crc kubenswrapper[4957]: I0123 10:52:02.007429 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z8hcw_87adc28a-89e3-4743-a9f2-098d4a9432d8/ovnkube-controller/0.log" Jan 23 10:52:02 crc kubenswrapper[4957]: I0123 10:52:02.009537 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" event={"ID":"87adc28a-89e3-4743-a9f2-098d4a9432d8","Type":"ContainerStarted","Data":"d660c0e57541cf895c30cdd702c50a43e189e0ed85fe800f422cc1646cbb57e5"} Jan 23 10:52:02 crc kubenswrapper[4957]: I0123 10:52:02.009636 4957 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 23 10:52:02 crc kubenswrapper[4957]: E0123 10:52:02.017547 4957 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"608e000a-3057-4f1e-b4ab-15bf3bfea3b8\\\",\\\"systemUUID\\\":\\\"4219e85c-09d5-42d3-a5cb-7c9fe3da136f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:02Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:02 crc kubenswrapper[4957]: I0123 10:52:02.021050 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:02 crc kubenswrapper[4957]: I0123 10:52:02.021078 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 23 10:52:02 crc kubenswrapper[4957]: I0123 10:52:02.021086 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:02 crc kubenswrapper[4957]: I0123 10:52:02.021100 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:02 crc kubenswrapper[4957]: I0123 10:52:02.021111 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:02Z","lastTransitionTime":"2026-01-23T10:52:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:02 crc kubenswrapper[4957]: I0123 10:52:02.028708 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8631604-63ce-40b0-b27e-fba17f940f20\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://523b9a208f414955faffe254957d3bb6d287eab26ea653e23c9bcc2c3182d5cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80645d17b02b24a907b20d376fcb65a794768d4c9cf07550bff63d50a011836d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resourc
es\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3fcfc9fcf5f37f32b4a654710f3f0f5c3fab5b0b5c35239e5f1a2789d1ec480\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b48e3593322a778bf8d56e3509d97a341f1fee5e172f8ba4bbc4c1dacefb3930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://193091ca5d5fb974b1e2da289e7fbc6e2d3a292e79d1936c7ba10266a5ba9779\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7feaf0932d8842db7af2f28984a811c16674b50ba28a4627829ce8471543daa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7feaf0932d8842db7af2f28984a811c16674b50ba28a4627829ce8471543daa9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://370f9a1676bfe308ad9883454af09a802df678db77e0e246de3cc7aea91410a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://370f9a1676bfe308ad9883454af09a802df678db77e0e246de3cc7aea91410a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://78b38484db8444b196494249f32bd63f2478d80e89ff823c8ac7c974b1ea6e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78b38484db8444b196494249f32bd63f2478d80e89ff823c8ac7c974b1ea6e76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:02Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:02 crc kubenswrapper[4957]: E0123 10:52:02.031452 4957 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\
"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":45063
7738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"608e000a-3057-4f1e-b4ab-15bf3bfea3b8\\\",\\\"systemUUID\\\":\\\"4219e85c-09d5-42d3-a5cb-7c9fe3da136f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:02Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:02 crc kubenswrapper[4957]: I0123 10:52:02.034993 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:02 crc kubenswrapper[4957]: I0123 10:52:02.035028 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:02 crc kubenswrapper[4957]: I0123 10:52:02.035038 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:02 crc kubenswrapper[4957]: I0123 10:52:02.035054 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:02 crc kubenswrapper[4957]: I0123 10:52:02.035066 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:02Z","lastTransitionTime":"2026-01-23T10:52:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:02 crc kubenswrapper[4957]: I0123 10:52:02.041370 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b07c6571fe0e39bd6607feb900919a481ef8a36483c9b4de1c6d5ea3453ba61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:02Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:02 crc kubenswrapper[4957]: E0123 10:52:02.046604 4957 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:02Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeByt
es\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"608e000a-3057-4f1e-b4ab-15bf3bfea3b8\\\",\\\"systemUUID\\\":\\\"4
219e85c-09d5-42d3-a5cb-7c9fe3da136f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:02Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:02 crc kubenswrapper[4957]: I0123 10:52:02.049139 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:02 crc kubenswrapper[4957]: I0123 10:52:02.049171 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:02 crc kubenswrapper[4957]: I0123 10:52:02.049180 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:02 crc kubenswrapper[4957]: I0123 10:52:02.049195 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:02 crc kubenswrapper[4957]: I0123 10:52:02.049204 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:02Z","lastTransitionTime":"2026-01-23T10:52:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:02 crc kubenswrapper[4957]: I0123 10:52:02.051149 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fnxz6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c7b1449-2e9b-4c07-a531-591cb968f511\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6410b47b5b38b4ce50175e8cd9c2cc7ca241b914d1dba4accf3a1deb3e066ccd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj687\\\",\\\"readOnly\\\":true,\\\"recursi
veReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:49Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fnxz6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:02Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:02 crc kubenswrapper[4957]: I0123 10:52:02.064219 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea507738-b425-4366-808b-3a47317e66d0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cdf71b1a8491d3a4853fde19a5b1af1eb4697cbf07de482e22a52704ba0470f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52eea4c3c7c3b8898e64dd0eb05c1883ea1c2fa94e7e606f3ab48bbf5aaee8d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volu
meMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f405b6b517d30a201b793965bd82536f496d62b89562cefc7e3a9d9f7829633\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b6915f908509c8609290327ffc2dccf0e5680dc227979285a7ebaca4643cb7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e837e02e63dbe59e7920302c0fb0b5c9165e96ebb684adadb02bacd61633214\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"le observer\\\\nW0123 10:51:48.273886 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0123 10:51:48.273997 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0123 10:51:48.275269 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3502356273/tls.crt::/tmp/serving-cert-3502356273/tls.key\\\\\\\"\\\\nI0123 10:51:48.537137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0123 10:51:48.548481 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0123 10:51:48.548515 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0123 10:51:48.548544 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0123 10:51:48.548577 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0123 10:51:48.561057 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0123 10:51:48.561112 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0123 10:51:48.561123 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0123 10:51:48.561139 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0123 10:51:48.561151 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0123 10:51:48.561158 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0123 10:51:48.561167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0123 10:51:48.561209 1 genericapiserver.go:533] 
MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0123 10:51:48.561756 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da77583099215643577c5d064d67ce2cca9d0b74e7ba7c88f3a948a8516fd66c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd61e9841e88c0a34764491cfafe3e4f94233e4ac0543a3e6ed1326dd8d7280d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd61e9841e88c0a34764491cfafe3e4f94233e4ac0543a3e6ed1326dd8d7280d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:02Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:02 crc kubenswrapper[4957]: E0123 10:52:02.065050 4957 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae
669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"608e000a-3057-4f1e-b4ab-15bf3bfea3b8\\\",\\\"systemUUID\\\":\\\"4219e85c-09d5-42d3-a5cb-7c9fe3da136f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:02Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:02 crc kubenswrapper[4957]: E0123 10:52:02.065184 4957 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 23 10:52:02 crc kubenswrapper[4957]: I0123 10:52:02.066574 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:02 crc kubenswrapper[4957]: I0123 10:52:02.066611 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:02 crc kubenswrapper[4957]: I0123 10:52:02.066623 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:02 crc kubenswrapper[4957]: I0123 10:52:02.066641 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:02 crc kubenswrapper[4957]: I0123 10:52:02.066653 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:02Z","lastTransitionTime":"2026-01-23T10:52:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:02 crc kubenswrapper[4957]: I0123 10:52:02.074819 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:02Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:02 crc kubenswrapper[4957]: I0123 10:52:02.098038 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9f19c40d295c11e3a1170d61fd738b1dcd8fb10087f6a1bb74e6e6c8e6cfb56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6fcda9eaf99f7b60db85da6f18a98ccca7b5bc532aa28388fc7845caf1a7356\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:02Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:02 crc kubenswrapper[4957]: I0123 10:52:02.113876 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:02Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:02 crc kubenswrapper[4957]: I0123 10:52:02.127435 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6cq2v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11d94cd0-1619-4ef6-952a-aef84e1cdc75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdb19fbdef461009ebd78d9089ba9c94908e4c9fbcab108320e0d89c7f30547f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d28aff8ae7d4adbef5753dc21af0f1055f19ed4404bf9af31f34be215120555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d28aff8ae7d4adbef5753dc21af0f1055f19ed4404bf9af31f34be215120555\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90e2094361e033ef7c8226913e9e396e6a2036617edb9c5ad0ed00c2c5f9f707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90e2094361e033ef7c8226913e9e396e6a2036617edb9c5ad0ed00c2c5f9f707\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e945a247fdc5833508edb04c5130a406d137e3b0ab1be12f9bf954f2621c3bf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e945a247fdc5833508edb04c5130a406d137e3b0ab1be12f9bf954f2621c3bf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26ffeff282bb20d68e9f648999f28b818a67b74f062714fd26ad22e70591a74c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26ffeff282bb20d68e9f648999f28b818a67b74f062714fd26ad22e70591a74c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bbe74c2c27b97766cfa5d26987cd0fbd8289871e66eab78c54a52c0aa039728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bbe74c2c27b97766cfa5d26987cd0fbd8289871e66eab78c54a52c0aa039728\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9ee5b55e77324735662dd6bc0fdeee86af454eb4b9e8eb9e877119f7c1395ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9ee5b55e77324735662dd6bc0fdeee86af454eb4b9e8eb9e877119f7c1395ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6cq2v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:02Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:02 crc kubenswrapper[4957]: I0123 10:52:02.145740 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://512cd439903792d034cd6017d149d8f3e9e24ffbfc36964572fc9419d54c3513\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:02Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:02 crc kubenswrapper[4957]: I0123 10:52:02.162951 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:02Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:02 crc kubenswrapper[4957]: I0123 10:52:02.173424 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:02 crc kubenswrapper[4957]: I0123 10:52:02.173457 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:02 crc kubenswrapper[4957]: I0123 10:52:02.173466 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:02 crc kubenswrapper[4957]: I0123 10:52:02.173481 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:02 crc kubenswrapper[4957]: I0123 10:52:02.173491 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:02Z","lastTransitionTime":"2026-01-23T10:52:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:02 crc kubenswrapper[4957]: I0123 10:52:02.184073 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb53662e-fe72-4c19-b3a6-f5b541e5afcb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://829cfdb541d2a7861316957f39b8b9f43ec6f9f4e309a491f4451b1f3c34a9a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d9f270c80ebedc7d79510e2f421e23789483dce954f5e1469469703660febf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90926087d1bb350c991fa9425706fcc22e12eec003aba87b72758892aae9d3a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8144556693b41dc2f9121be49ceed161caf8db5eec797f086128a2016be8072\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:02Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:02 crc kubenswrapper[4957]: I0123 10:52:02.198485 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rg9hb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a6ddd9-627a-4faa-a4c4-096ea19af31d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f469de9a3c43ade33ae855757f1244dcd825827dea9633af7143c078b08d6d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wngq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rg9hb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:02Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:02 crc kubenswrapper[4957]: I0123 10:52:02.210593 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2xjv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"224e3211-1f68-4673-8975-7e71b1e513d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dd046581d049e9ca0071a010da143a9b28d271b533b9cdc1c94d19311be0320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s26mf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f355e8990ff693448c7b8df392b7b2caeb59d6fee6cf8d5d4200f8ce1b5e03ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s26mf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w2xjv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:02Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:02 crc kubenswrapper[4957]: I0123 10:52:02.225349 4957 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-tlz2g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"233fdd78-4010-4fe8-9068-ee47d8ff25d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6851e0ec1040550b8c9edb1b85213d2c849e381fae6b0f09c9a7247bd9c5088\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpwrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-tlz2g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:02Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:02 crc kubenswrapper[4957]: I0123 10:52:02.248475 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87adc28a-89e3-4743-a9f2-098d4a9432d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a14cf2687aa7c7a4c43dffbc2ad99a41aef0e46719171f63c7f769ee2d54e4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca18567eec1b0cc34d911b28d9f3d670a061722086817f58236f6a0da557262\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cer
t\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f862c6f11fe904458a8ecde92079c0b4aa4a9cb4dfc6f2ca094a1d3142570d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0efa75cf10a812bc4de5b071048558eb5f48828f6fb3049f3820fe5e0b7e2b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1a90ac89ce8ac710e5f8cff26e69aff44a735ef8155a7e93324809904a33e01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26b4bdc4f2514902dc2c95df59af4c954a5c1905821f5981e9437ff54d6d544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d660c0e57541cf895c30cdd702c50a43e189e0ed85fe800f422cc1646cbb57e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d87256dede5212cdcd07157c02087d16643332173910f7686c8b8fcaff9ed938\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-23T10:52:00Z\\\",\\\"message\\\":\\\"ssqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0123 10:52:00.255642 6262 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0123 10:52:00.256029 6262 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0123 10:52:00.256067 6262 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0123 10:52:00.256090 6262 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0123 10:52:00.256112 6262 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0123 10:52:00.256120 6262 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0123 10:52:00.256141 6262 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0123 10:52:00.256177 6262 factory.go:656] Stopping watch factory\\\\nI0123 10:52:00.256195 6262 ovnkube.go:599] Stopped ovnkube\\\\nI0123 10:52:00.256233 6262 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0123 10:52:00.256256 6262 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0123 10:52:00.256271 6262 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0123 10:52:00.256300 6262 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0123 10:52:00.256331 6262 handler.go:208] Removed *v1.Pod event handler 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:57Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1be5b459e3fae28da165ef0ee506ec5ccd39026d7b7e7c35a3f242c65d60d0ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"conta
inerID\\\":\\\"cri-o://3de3a4f166173fb3f0ff7428b1d1ddf43a9988749fc445ecff4a7527bd1f3a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3de3a4f166173fb3f0ff7428b1d1ddf43a9988749fc445ecff4a7527bd1f3a4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z8hcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:02Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:02 crc kubenswrapper[4957]: I0123 10:52:02.276940 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:02 crc kubenswrapper[4957]: I0123 10:52:02.277192 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:02 crc kubenswrapper[4957]: I0123 10:52:02.277436 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:02 crc kubenswrapper[4957]: I0123 10:52:02.277650 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:02 crc kubenswrapper[4957]: I0123 10:52:02.277871 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:02Z","lastTransitionTime":"2026-01-23T10:52:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:02 crc kubenswrapper[4957]: I0123 10:52:02.381337 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:02 crc kubenswrapper[4957]: I0123 10:52:02.381375 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:02 crc kubenswrapper[4957]: I0123 10:52:02.381387 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:02 crc kubenswrapper[4957]: I0123 10:52:02.381403 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:02 crc kubenswrapper[4957]: I0123 10:52:02.381414 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:02Z","lastTransitionTime":"2026-01-23T10:52:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:02 crc kubenswrapper[4957]: I0123 10:52:02.484681 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:02 crc kubenswrapper[4957]: I0123 10:52:02.484799 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:02 crc kubenswrapper[4957]: I0123 10:52:02.484823 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:02 crc kubenswrapper[4957]: I0123 10:52:02.484853 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:02 crc kubenswrapper[4957]: I0123 10:52:02.484874 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:02Z","lastTransitionTime":"2026-01-23T10:52:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:02 crc kubenswrapper[4957]: I0123 10:52:02.587592 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:02 crc kubenswrapper[4957]: I0123 10:52:02.587663 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:02 crc kubenswrapper[4957]: I0123 10:52:02.587686 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:02 crc kubenswrapper[4957]: I0123 10:52:02.587710 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:02 crc kubenswrapper[4957]: I0123 10:52:02.587738 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:02Z","lastTransitionTime":"2026-01-23T10:52:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:02 crc kubenswrapper[4957]: I0123 10:52:02.690823 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:02 crc kubenswrapper[4957]: I0123 10:52:02.690882 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:02 crc kubenswrapper[4957]: I0123 10:52:02.690907 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:02 crc kubenswrapper[4957]: I0123 10:52:02.690940 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:02 crc kubenswrapper[4957]: I0123 10:52:02.690965 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:02Z","lastTransitionTime":"2026-01-23T10:52:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:02 crc kubenswrapper[4957]: I0123 10:52:02.731108 4957 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-01 14:22:18.079623946 +0000 UTC Jan 23 10:52:02 crc kubenswrapper[4957]: I0123 10:52:02.769236 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 23 10:52:02 crc kubenswrapper[4957]: I0123 10:52:02.769253 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 23 10:52:02 crc kubenswrapper[4957]: I0123 10:52:02.769465 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 23 10:52:02 crc kubenswrapper[4957]: E0123 10:52:02.769680 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 23 10:52:02 crc kubenswrapper[4957]: E0123 10:52:02.769789 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 23 10:52:02 crc kubenswrapper[4957]: E0123 10:52:02.769909 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 23 10:52:02 crc kubenswrapper[4957]: I0123 10:52:02.793876 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:02 crc kubenswrapper[4957]: I0123 10:52:02.793947 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:02 crc kubenswrapper[4957]: I0123 10:52:02.793970 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:02 crc kubenswrapper[4957]: I0123 10:52:02.793999 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:02 crc kubenswrapper[4957]: I0123 10:52:02.794021 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:02Z","lastTransitionTime":"2026-01-23T10:52:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:02 crc kubenswrapper[4957]: I0123 10:52:02.897841 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:02 crc kubenswrapper[4957]: I0123 10:52:02.897891 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:02 crc kubenswrapper[4957]: I0123 10:52:02.897914 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:02 crc kubenswrapper[4957]: I0123 10:52:02.897944 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:02 crc kubenswrapper[4957]: I0123 10:52:02.897970 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:02Z","lastTransitionTime":"2026-01-23T10:52:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:02 crc kubenswrapper[4957]: I0123 10:52:02.993272 4957 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9rkq"] Jan 23 10:52:02 crc kubenswrapper[4957]: I0123 10:52:02.994105 4957 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9rkq" Jan 23 10:52:02 crc kubenswrapper[4957]: I0123 10:52:02.997978 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Jan 23 10:52:02 crc kubenswrapper[4957]: I0123 10:52:02.998076 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Jan 23 10:52:03 crc kubenswrapper[4957]: I0123 10:52:03.000988 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:03 crc kubenswrapper[4957]: I0123 10:52:03.001045 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:03 crc kubenswrapper[4957]: I0123 10:52:03.001072 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:03 crc kubenswrapper[4957]: I0123 10:52:03.001102 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:03 crc kubenswrapper[4957]: I0123 10:52:03.001127 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:03Z","lastTransitionTime":"2026-01-23T10:52:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:03 crc kubenswrapper[4957]: I0123 10:52:03.015358 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z8hcw_87adc28a-89e3-4743-a9f2-098d4a9432d8/ovnkube-controller/1.log" Jan 23 10:52:03 crc kubenswrapper[4957]: I0123 10:52:03.020337 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z8hcw_87adc28a-89e3-4743-a9f2-098d4a9432d8/ovnkube-controller/0.log" Jan 23 10:52:03 crc kubenswrapper[4957]: I0123 10:52:03.021502 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb53662e-fe72-4c19-b3a6-f5b541e5afcb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://829cfdb541d2a7861316957f39b8b9f43ec6f9f4e309a491f4451b1f3c34a9a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d9f270c80ebedc7d79510e2f421e23789483dce954f5e1469469703660febf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90926087d1bb350c991fa9425706fcc22e12eec003aba87b72758892aae9d3a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8144556693b41dc2f9121be49ceed161caf8db5eec797f086128a2016be8072\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:03Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:03 crc kubenswrapper[4957]: I0123 10:52:03.029302 4957 generic.go:334] "Generic (PLEG): container finished" podID="87adc28a-89e3-4743-a9f2-098d4a9432d8" containerID="d660c0e57541cf895c30cdd702c50a43e189e0ed85fe800f422cc1646cbb57e5" exitCode=1 Jan 23 10:52:03 crc kubenswrapper[4957]: I0123 10:52:03.029320 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" event={"ID":"87adc28a-89e3-4743-a9f2-098d4a9432d8","Type":"ContainerDied","Data":"d660c0e57541cf895c30cdd702c50a43e189e0ed85fe800f422cc1646cbb57e5"} Jan 23 10:52:03 crc kubenswrapper[4957]: I0123 10:52:03.029410 4957 scope.go:117] "RemoveContainer" containerID="d87256dede5212cdcd07157c02087d16643332173910f7686c8b8fcaff9ed938" Jan 23 10:52:03 crc kubenswrapper[4957]: I0123 10:52:03.030320 4957 scope.go:117] "RemoveContainer" containerID="d660c0e57541cf895c30cdd702c50a43e189e0ed85fe800f422cc1646cbb57e5" Jan 23 10:52:03 crc kubenswrapper[4957]: E0123 10:52:03.030541 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-z8hcw_openshift-ovn-kubernetes(87adc28a-89e3-4743-a9f2-098d4a9432d8)\"" pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" podUID="87adc28a-89e3-4743-a9f2-098d4a9432d8" Jan 23 10:52:03 crc kubenswrapper[4957]: I0123 10:52:03.034629 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rg9hb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a6ddd9-627a-4faa-a4c4-096ea19af31d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f469de9a3c43ade33ae855757f1244dcd825827dea9633af7143c078b08d6d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wngq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rg9hb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:03Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:03 crc kubenswrapper[4957]: I0123 10:52:03.050538 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2xjv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"224e3211-1f68-4673-8975-7e71b1e513d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dd046581d049e9ca0071a010da143a9b28d271b533b9cdc1c94d19311be0320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s26mf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f355e8990ff693448c7b8df392b7b2caeb59d6fee6cf8d5d4200f8ce1b5e03ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s26mf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w2xjv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:03Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:03 crc kubenswrapper[4957]: I0123 10:52:03.061001 4957 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55l78\" (UniqueName: \"kubernetes.io/projected/340bb9e5-0a20-4377-acf7-aba4b7788153-kube-api-access-55l78\") pod \"ovnkube-control-plane-749d76644c-r9rkq\" (UID: \"340bb9e5-0a20-4377-acf7-aba4b7788153\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9rkq" Jan 23 10:52:03 crc kubenswrapper[4957]: I0123 10:52:03.061060 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/340bb9e5-0a20-4377-acf7-aba4b7788153-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-r9rkq\" (UID: \"340bb9e5-0a20-4377-acf7-aba4b7788153\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9rkq" Jan 23 10:52:03 crc kubenswrapper[4957]: I0123 10:52:03.061080 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/340bb9e5-0a20-4377-acf7-aba4b7788153-env-overrides\") pod \"ovnkube-control-plane-749d76644c-r9rkq\" (UID: \"340bb9e5-0a20-4377-acf7-aba4b7788153\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9rkq" Jan 23 10:52:03 crc kubenswrapper[4957]: I0123 10:52:03.061099 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/340bb9e5-0a20-4377-acf7-aba4b7788153-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-r9rkq\" (UID: \"340bb9e5-0a20-4377-acf7-aba4b7788153\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9rkq" Jan 23 10:52:03 crc kubenswrapper[4957]: I0123 10:52:03.071772 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tlz2g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"233fdd78-4010-4fe8-9068-ee47d8ff25d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6851e0ec1040550b8c9edb1b85213d2c849e381fae6b0f09c9a7247bd9c5088\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpwrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tlz2g\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:03Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:03 crc kubenswrapper[4957]: I0123 10:52:03.089115 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87adc28a-89e3-4743-a9f2-098d4a9432d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a14cf2687aa7c7a4c43dffbc2ad99a41aef0e46719171f63c7f769ee2d54e4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca18567eec1b0cc34d911b28d9f3d670a061722086817f58236f6a0da557262\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f862c6f11fe904458a8ecde92079c0b4aa4a9cb4dfc6f2ca094a1d3142570d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0efa75cf10a812bc4de5b071048558eb5f48828f6fb3049f3820fe5e0b7e2b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1a90ac89ce8ac710e5f8cff26e69aff44a735ef8155a7e93324809904a33e01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26b4b
dc4f2514902dc2c95df59af4c954a5c1905821f5981e9437ff54d6d544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d660c0e57541cf895c30cdd702c50a43e189e0ed85fe800f422cc1646cbb57e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d87256dede5212cdcd07157c02087d16643332173910f7686c8b8fcaff9ed938\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-23T10:52:00Z\\\",\\\"message\\\":\\\"ssqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0123 10:52:00.255642 6262 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0123 10:52:00.256029 6262 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0123 10:52:00.256067 6262 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0123 10:52:00.256090 6262 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0123 10:52:00.256112 6262 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0123 10:52:00.256120 6262 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0123 10:52:00.256141 6262 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0123 10:52:00.256177 6262 factory.go:656] Stopping watch factory\\\\nI0123 10:52:00.256195 6262 ovnkube.go:599] Stopped ovnkube\\\\nI0123 10:52:00.256233 6262 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0123 10:52:00.256256 6262 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0123 10:52:00.256271 6262 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0123 10:52:00.256300 6262 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0123 10:52:00.256331 6262 handler.go:208] Removed *v1.Pod event handler 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:57Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1be5b459e3fae28da165ef0ee506ec5ccd39026d7b7e7c35a3f242c65d60d0ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"conta
inerID\\\":\\\"cri-o://3de3a4f166173fb3f0ff7428b1d1ddf43a9988749fc445ecff4a7527bd1f3a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3de3a4f166173fb3f0ff7428b1d1ddf43a9988749fc445ecff4a7527bd1f3a4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z8hcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:03Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:03 crc kubenswrapper[4957]: I0123 10:52:03.102704 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9rkq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"340bb9e5-0a20-4377-acf7-aba4b7788153\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55l78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55l78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:52:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r9rkq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:03Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:03 crc kubenswrapper[4957]: I0123 10:52:03.103998 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:03 crc kubenswrapper[4957]: I0123 10:52:03.104031 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:03 crc kubenswrapper[4957]: I0123 10:52:03.104039 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:03 crc kubenswrapper[4957]: I0123 10:52:03.104054 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:03 crc kubenswrapper[4957]: I0123 10:52:03.104064 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:03Z","lastTransitionTime":"2026-01-23T10:52:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:03 crc kubenswrapper[4957]: I0123 10:52:03.132598 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8631604-63ce-40b0-b27e-fba17f940f20\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://523b9a208f414955faffe254957d3bb6d287eab26ea653e23c9bcc2c3182d5cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80645d17b02b24a907b20d376fcb65a794768d4c9cf07550bff63d50a011836d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3fcfc9fcf5f37f32b4a654710f3f0f5c3fab5b0b5c35239e5f1a2789d1ec480\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b48e3593322a778bf8d56e3509d97a341f1fee5e172f8ba4bbc4c1dacefb3930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://193091ca5d5fb974b1e2da289e7fbc6e2d3a292e79d1936c7ba10266a5ba9779\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7feaf0932d8842db7af2f28984a811c16674b50ba28a4627829ce8471543daa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7feaf0932d8842db7af2f28984a811c16674b50ba28a4627829ce8471543daa9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://370f9a1676bfe308ad9883454af09a802df678db77e0e246de3cc7aea91410a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://370f9a1676bfe308ad9883454af09a802df678db77e0e246de3cc7aea91410a6\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://78b38484db8444b196494249f32bd63f2478d80e89ff823c8ac7c974b1ea6e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78b38484db8444b196494249f32bd63f2478d80e89ff823c8ac7c974b1ea6e76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:03Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:03 crc kubenswrapper[4957]: I0123 10:52:03.144047 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b07c6571fe0e39bd6607feb900919a481ef8a36483c9b4de1c6d5ea3453ba61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:03Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:03 crc kubenswrapper[4957]: I0123 10:52:03.154435 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fnxz6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c7b1449-2e9b-4c07-a531-591cb968f511\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6410b47b5b38b4ce50175e8cd9c2cc7ca241b914d1dba4accf3a1deb3e066ccd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj687\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:49Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fnxz6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:03Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:03 crc kubenswrapper[4957]: I0123 10:52:03.162668 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/340bb9e5-0a20-4377-acf7-aba4b7788153-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-r9rkq\" (UID: \"340bb9e5-0a20-4377-acf7-aba4b7788153\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9rkq" Jan 23 10:52:03 crc kubenswrapper[4957]: I0123 10:52:03.162796 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/340bb9e5-0a20-4377-acf7-aba4b7788153-env-overrides\") pod \"ovnkube-control-plane-749d76644c-r9rkq\" (UID: \"340bb9e5-0a20-4377-acf7-aba4b7788153\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9rkq" Jan 23 10:52:03 crc kubenswrapper[4957]: I0123 10:52:03.162872 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/340bb9e5-0a20-4377-acf7-aba4b7788153-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-r9rkq\" (UID: \"340bb9e5-0a20-4377-acf7-aba4b7788153\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9rkq" Jan 23 10:52:03 crc kubenswrapper[4957]: I0123 10:52:03.162976 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55l78\" (UniqueName: \"kubernetes.io/projected/340bb9e5-0a20-4377-acf7-aba4b7788153-kube-api-access-55l78\") pod \"ovnkube-control-plane-749d76644c-r9rkq\" (UID: \"340bb9e5-0a20-4377-acf7-aba4b7788153\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9rkq" Jan 23 10:52:03 crc kubenswrapper[4957]: I0123 10:52:03.163909 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/340bb9e5-0a20-4377-acf7-aba4b7788153-env-overrides\") pod \"ovnkube-control-plane-749d76644c-r9rkq\" (UID: \"340bb9e5-0a20-4377-acf7-aba4b7788153\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9rkq" Jan 23 10:52:03 crc kubenswrapper[4957]: I0123 10:52:03.164376 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/340bb9e5-0a20-4377-acf7-aba4b7788153-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-r9rkq\" (UID: \"340bb9e5-0a20-4377-acf7-aba4b7788153\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9rkq" Jan 23 10:52:03 crc kubenswrapper[4957]: I0123 10:52:03.170419 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/340bb9e5-0a20-4377-acf7-aba4b7788153-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-r9rkq\" (UID: \"340bb9e5-0a20-4377-acf7-aba4b7788153\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9rkq" Jan 23 10:52:03 crc kubenswrapper[4957]: I0123 10:52:03.183865 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea507738-b425-4366-808b-3a47317e66d0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cdf71b1a8491d3a4853fde19a5b1af1eb4697cbf07de482e22a52704ba0470f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52eea4c3c7c3b8898e64dd0eb05c1883ea1c2fa94e7e606f3ab48bbf5aaee8d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f405b6b517d30a201b793965bd82536f496d62b89562cefc7e3a9d9f7829633\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b6915f908509c8609290327ffc2dccf0e5680dc227979285a7ebaca4643cb7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e837e02e63dbe59e7920302c0fb0b5c9165e96ebb684adadb02bacd61633214\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"le observer\\\\nW0123 10:51:48.273886 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0123 10:51:48.273997 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0123 10:51:48.275269 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3502356273/tls.crt::/tmp/serving-cert-3502356273/tls.key\\\\\\\"\\\\nI0123 10:51:48.537137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0123 10:51:48.548481 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0123 10:51:48.548515 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0123 10:51:48.548544 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0123 10:51:48.548577 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0123 10:51:48.561057 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0123 10:51:48.561112 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0123 10:51:48.561123 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0123 10:51:48.561139 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0123 10:51:48.561151 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0123 10:51:48.561158 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0123 10:51:48.561167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0123 10:51:48.561209 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0123 10:51:48.561756 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da77583099215643577c5d064d67ce2cca9d0b74e7ba7c88f3a948a8516fd66c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd61e9841e88c0a34764491cfafe3e4f94233e4ac0543a3e6ed1326dd8d7280d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd61e9841e88c0a34764491cfafe3e4f94233e4ac0543a3e6ed1326dd8d7280d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:03Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:03 crc kubenswrapper[4957]: I0123 10:52:03.192395 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55l78\" (UniqueName: \"kubernetes.io/projected/340bb9e5-0a20-4377-acf7-aba4b7788153-kube-api-access-55l78\") pod \"ovnkube-control-plane-749d76644c-r9rkq\" (UID: \"340bb9e5-0a20-4377-acf7-aba4b7788153\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9rkq" Jan 23 10:52:03 crc kubenswrapper[4957]: I0123 10:52:03.203036 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:03Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:03 crc kubenswrapper[4957]: I0123 10:52:03.207353 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:03 crc kubenswrapper[4957]: I0123 10:52:03.207381 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:03 crc kubenswrapper[4957]: I0123 10:52:03.207391 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:03 crc kubenswrapper[4957]: I0123 10:52:03.207407 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:03 crc kubenswrapper[4957]: I0123 10:52:03.207419 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:03Z","lastTransitionTime":"2026-01-23T10:52:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:03 crc kubenswrapper[4957]: I0123 10:52:03.225583 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9f19c40d295c11e3a1170d61fd738b1dcd8fb10087f6a1bb74e6e6c8e6cfb56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6fcda9eaf99f7b60db85da6f18a98ccca7b5bc532aa28388fc7845caf1a7356\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:03Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:03 crc kubenswrapper[4957]: I0123 10:52:03.243115 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:03Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:03 crc kubenswrapper[4957]: I0123 10:52:03.263812 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6cq2v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11d94cd0-1619-4ef6-952a-aef84e1cdc75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdb19fbdef461009ebd78d9089ba9c94908e4c9fbcab108320e0d89c7f30547f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d28aff8ae7d4adbef5753dc21af0f1055f19ed4404bf9af31f34be215120555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d28aff8ae7d4adbef5753dc21af0f1055f19ed4404bf9af31f34be215120555\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90e2094361e033ef7c8226913e9e396e6a2036617edb9c5ad0ed00c2c5f9f707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90e2094361e033ef7c8226913e9e396e6a2036617edb9c5ad0ed00c2c5f9f707\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e945a247fdc5833508edb04c5130a406d137e3b0ab1be12f9bf954f2621c3bf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e945a247fdc5833508edb04c5130a406d137e3b0ab1be12f9bf954f2621c3bf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26ffeff282bb20d68e9f648999f28b818a67b74f062714fd26ad22e70591a74c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26ffeff282bb20d68e9f648999f28b818a67b74f062714fd26ad22e70591a74c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bbe74c2c27b97766cfa5d26987cd0fbd8289871e66eab78c54a52c0aa039728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bbe74c2c27b97766cfa5d26987cd0fbd8289871e66eab78c54a52c0aa039728\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9ee5b55e77324735662dd6bc0fdeee86af454eb4b9e8eb9e877119f7c1395ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9ee5b55e77324735662dd6bc0fdeee86af454eb4b9e8eb9e877119f7c1395ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6cq2v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:03Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:03 crc kubenswrapper[4957]: I0123 10:52:03.283658 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://512cd439903792d034cd6017d149d8f3e9e24ffbfc36964572fc9419d54c3513\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:03Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:03 crc kubenswrapper[4957]: I0123 10:52:03.303203 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:03Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:03 crc kubenswrapper[4957]: I0123 10:52:03.310971 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9rkq" Jan 23 10:52:03 crc kubenswrapper[4957]: I0123 10:52:03.311534 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:03 crc kubenswrapper[4957]: I0123 10:52:03.311616 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:03 crc kubenswrapper[4957]: I0123 10:52:03.311633 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:03 crc kubenswrapper[4957]: I0123 10:52:03.311655 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:03 crc kubenswrapper[4957]: I0123 10:52:03.311704 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:03Z","lastTransitionTime":"2026-01-23T10:52:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:03 crc kubenswrapper[4957]: I0123 10:52:03.320151 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9rkq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"340bb9e5-0a20-4377-acf7-aba4b7788153\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55l78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55l78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:52:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r9rkq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-01-23T10:52:03Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:03 crc kubenswrapper[4957]: I0123 10:52:03.336255 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb53662e-fe72-4c19-b3a6-f5b541e5afcb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://829cfdb541d2a7861316957f39b8b9f43ec6f9f4e309a491f4451b1f3c34a9a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d9f270c80ebedc7d79510e2f421e23789483dce954f5e1469469703660febf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90926087d1bb350c991fa9425706fcc22e12eec003aba87b72758892aae9d3a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"reso
urce-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8144556693b41dc2f9121be49ceed161caf8db5eec797f086128a2016be8072\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:03Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:03 crc kubenswrapper[4957]: I0123 10:52:03.355952 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rg9hb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a6ddd9-627a-4faa-a4c4-096ea19af31d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f469de9a3c43ade33ae855757f1244dcd825827dea9633af7143c078b08d6d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wngq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rg9hb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:03Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:03 crc kubenswrapper[4957]: I0123 10:52:03.373007 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2xjv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"224e3211-1f68-4673-8975-7e71b1e513d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dd046581d049e9ca0071a010da143a9b28d271b533b9cdc1c94d19311be0320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s26mf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f355e8990ff693448c7b8df392b7b2caeb59d6fee6cf8d5d4200f8ce1b5e03ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s26mf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w2xjv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:03Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:03 crc kubenswrapper[4957]: I0123 10:52:03.393017 4957 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-tlz2g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"233fdd78-4010-4fe8-9068-ee47d8ff25d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6851e0ec1040550b8c9edb1b85213d2c849e381fae6b0f09c9a7247bd9c5088\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpwrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-tlz2g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:03Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:03 crc kubenswrapper[4957]: I0123 10:52:03.414519 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:03 crc kubenswrapper[4957]: I0123 10:52:03.414551 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:03 crc kubenswrapper[4957]: I0123 10:52:03.414562 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:03 crc kubenswrapper[4957]: I0123 10:52:03.414578 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:03 crc kubenswrapper[4957]: I0123 10:52:03.414590 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:03Z","lastTransitionTime":"2026-01-23T10:52:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:03 crc kubenswrapper[4957]: I0123 10:52:03.424395 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87adc28a-89e3-4743-a9f2-098d4a9432d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a14cf2687aa7c7a4c43dffbc2ad99a41aef0e46719171f63c7f769ee2d54e4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca18567eec1b0cc34d911b28d9f3d670a061722086817f58236f6a0da557262\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f862c6f11fe904458a8ecde92079c0b4aa4a9cb4dfc6f2ca094a1d3142570d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0efa75cf10a812bc4de5b071048558eb5f48828f6fb3049f3820fe5e0b7e2b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1a90ac89ce8ac710e5f8cff26e69aff44a735ef8155a7e93324809904a33e01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26b4bdc4f2514902dc2c95df59af4c954a5c1905821f5981e9437ff54d6d544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d660c0e57541cf895c30cdd702c50a43e189e0ed
85fe800f422cc1646cbb57e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d87256dede5212cdcd07157c02087d16643332173910f7686c8b8fcaff9ed938\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-23T10:52:00Z\\\",\\\"message\\\":\\\"ssqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0123 10:52:00.255642 6262 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0123 10:52:00.256029 6262 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0123 10:52:00.256067 6262 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0123 10:52:00.256090 6262 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0123 10:52:00.256112 6262 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0123 10:52:00.256120 6262 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0123 10:52:00.256141 6262 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0123 10:52:00.256177 6262 factory.go:656] Stopping watch factory\\\\nI0123 10:52:00.256195 6262 ovnkube.go:599] Stopped ovnkube\\\\nI0123 10:52:00.256233 6262 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0123 10:52:00.256256 6262 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0123 10:52:00.256271 6262 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0123 10:52:00.256300 6262 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0123 10:52:00.256331 6262 handler.go:208] Removed *v1.Pod event handler \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:57Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d660c0e57541cf895c30cdd702c50a43e189e0ed85fe800f422cc1646cbb57e5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-23T10:52:02Z\\\",\\\"message\\\":\\\"ft/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0123 10:52:02.002619 6388 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0123 10:52:02.002696 6388 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0123 10:52:02.003246 6388 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0123 10:52:02.003275 6388 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0123 10:52:02.003331 6388 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0123 10:52:02.003358 6388 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0123 10:52:02.003364 6388 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0123 10:52:02.003360 6388 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0123 10:52:02.003378 6388 factory.go:656] Stopping watch factory\\\\nI0123 10:52:02.003385 6388 handler.go:208] Removed *v1.EgressFirewall event 
handler 9\\\\nI0123 10:52:02.003395 6388 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0123 10:52:02.003406 6388 handler.go:208] Removed *v1.Namespace ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-23T10:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1be5b459e3fae28da165ef0ee506ec5ccd39026d7b7e7c35a3f242c65d60d0ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-
o://3de3a4f166173fb3f0ff7428b1d1ddf43a9988749fc445ecff4a7527bd1f3a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3de3a4f166173fb3f0ff7428b1d1ddf43a9988749fc445ecff4a7527bd1f3a4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z8hcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:03Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:03 crc kubenswrapper[4957]: I0123 10:52:03.454973 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8631604-63ce-40b0-b27e-fba17f940f20\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://523b9a208f414955faffe254957d3bb6d287eab26ea653e23c9bcc2c3182d5cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80645d17b02b24a907b20d376fcb65a794768d4c9cf07550bff63d50a011836d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3fcfc9fcf5f37f32b4a654710f3f0f5c3fab5b0b5c35239e5f1a2789d1ec480\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b48e3593322a778bf8d56e3509d97a341f1fee5e172f8ba4bbc4c1dacefb3930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://193091ca5d5fb974b1e2da289e7fbc6e2d3a292e79d1936c7ba10266a5ba9779\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\
\"containerID\\\":\\\"cri-o://7feaf0932d8842db7af2f28984a811c16674b50ba28a4627829ce8471543daa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7feaf0932d8842db7af2f28984a811c16674b50ba28a4627829ce8471543daa9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://370f9a1676bfe308ad9883454af09a802df678db77e0e246de3cc7aea91410a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://370f9a1676bfe308ad9883454af09a802df678db77e0e246de3cc7aea91410a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://78b38484db8444b196494249f32bd63f2478d80e89ff823c8ac7c974b1ea6e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78b38484db8444b196494249f32bd63f2478d80e89ff823c8ac7c974b1ea6e76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:03Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:03 crc kubenswrapper[4957]: I0123 10:52:03.473518 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b07c6571fe0e39bd6607feb900919a481ef8a36483c9b4de1c6d5ea3453ba61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:03Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:03 crc kubenswrapper[4957]: I0123 10:52:03.489228 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fnxz6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c7b1449-2e9b-4c07-a531-591cb968f511\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6410b47b5b38b4ce50175e8cd9c2cc7ca241b914d1dba4accf3a1deb3e066ccd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj687\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:49Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fnxz6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:03Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:03 crc kubenswrapper[4957]: I0123 10:52:03.509805 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:03Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:03 crc kubenswrapper[4957]: I0123 10:52:03.518021 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:03 crc kubenswrapper[4957]: I0123 10:52:03.518067 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:03 crc kubenswrapper[4957]: I0123 10:52:03.518082 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:03 crc kubenswrapper[4957]: I0123 10:52:03.518100 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:03 crc kubenswrapper[4957]: I0123 10:52:03.518114 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:03Z","lastTransitionTime":"2026-01-23T10:52:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:03 crc kubenswrapper[4957]: I0123 10:52:03.528630 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6cq2v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11d94cd0-1619-4ef6-952a-aef84e1cdc75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdb19fbdef461009ebd78d9089ba9c94908e4c9fbcab108320e0d89c7f30547f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d28aff8ae7d4adbef5753dc21af0f1055f19ed4404bf9af31f34be215120555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d28aff8ae7d4adbef5753dc21af0f1055f19ed4404bf9af31f34be215120555\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90e2094361e033ef7c8226913e9e396e6a2036617edb9c5ad0ed00c2c5f9f707\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90e2094361e033ef7c8226913e9e396e6a2036617edb9c5ad0ed00c2c5f9f707\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e945a247fdc5833508edb04c5130a406d137e3b0ab1be12f9bf954f2621c3bf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e945a247fdc5833508edb04c5130a406d137e3b0ab1be12f9bf954f2621c3bf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26ffeff282bb20d68e9f648999f28b818a67b74f062714fd26ad22e70591a74c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26ffeff282bb20d68e9f648999f28b818a67b74f062714fd26ad22e70591a74c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bbe74c2c27b97766cfa5d26987cd0fbd8289871e66eab78c54a52c0aa039728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bbe74c2c27b97766cfa5d26987cd0fbd8289871e66eab78c54a52c0aa039728\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9ee5b55e77324735662dd6bc0fdeee86af454eb4b9e8eb9e877119f7c1395ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9ee5b55e77324735662dd6bc0fdeee86af454eb4b9e8eb9e877119f7c1395ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6cq2v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:03Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:03 crc kubenswrapper[4957]: I0123 10:52:03.546314 4957 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea507738-b425-4366-808b-3a47317e66d0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cdf71b1a8491d3a4853fde19a5b1af1eb4697cbf07de482e22a52704ba0470f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52eea4c3c7c3b8898e64dd0eb05c1883ea1c2fa94e7e606f3ab48bbf5aaee8d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f405b6b517d30a201b793965bd82536f496d62b89562cefc7e3a9d9f7829633\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc
/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b6915f908509c8609290327ffc2dccf0e5680dc227979285a7ebaca4643cb7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e837e02e63dbe59e7920302c0fb0b5c9165e96ebb684adadb02bacd61633214\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"le observer\\\\nW0123 10:51:48.273886 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0123 10:51:48.273997 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0123 10:51:48.275269 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3502356273/tls.crt::/tmp/serving-cert-3502356273/tls.key\\\\\\\"\\\\nI0123 10:51:48.537137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0123 10:51:48.548481 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0123 10:51:48.548515 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0123 10:51:48.548544 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0123 10:51:48.548577 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0123 10:51:48.561057 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0123 10:51:48.561112 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0123 10:51:48.561123 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0123 10:51:48.561139 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0123 10:51:48.561151 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0123 10:51:48.561158 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0123 10:51:48.561167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0123 10:51:48.561209 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0123 10:51:48.561756 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da77583099215643577c5d064d67ce2cca9d0b74e7ba7c88f3a948a8516fd66c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd61e9841e88c0a34764491cfafe3e4f94233e4ac0543a3e6ed1326dd8d7280d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd61e9841e88c0a34764491cfafe3e4f94233e4ac0543a3e6ed1326dd8d7280d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:03Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:03 crc kubenswrapper[4957]: I0123 10:52:03.561623 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:03Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:03 crc kubenswrapper[4957]: I0123 10:52:03.580836 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9f19c40d295c11e3a1170d61fd738b1dcd8fb10087f6a1bb74e6e6c8e6cfb56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6fcda9eaf99f7b60db85da6f18a98ccca7b5bc532aa28388fc7845caf1a7356\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:03Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:03 crc kubenswrapper[4957]: I0123 10:52:03.599586 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://512cd439903792d034cd6017d149d8f3e9e24ffbfc36964572fc9419d54c3513\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:03Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:03 crc kubenswrapper[4957]: I0123 10:52:03.624369 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:03 crc kubenswrapper[4957]: I0123 10:52:03.624410 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:03 crc kubenswrapper[4957]: I0123 10:52:03.624421 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:03 crc kubenswrapper[4957]: I0123 10:52:03.624437 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:03 crc kubenswrapper[4957]: I0123 10:52:03.624450 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:03Z","lastTransitionTime":"2026-01-23T10:52:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:03 crc kubenswrapper[4957]: I0123 10:52:03.635316 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:03Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:03 crc kubenswrapper[4957]: I0123 10:52:03.727694 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:03 crc kubenswrapper[4957]: I0123 10:52:03.728017 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:03 crc kubenswrapper[4957]: I0123 10:52:03.728165 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:03 crc kubenswrapper[4957]: I0123 10:52:03.728390 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:03 crc kubenswrapper[4957]: I0123 10:52:03.728550 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:03Z","lastTransitionTime":"2026-01-23T10:52:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:03 crc kubenswrapper[4957]: I0123 10:52:03.731667 4957 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 23:06:44.612490331 +0000 UTC Jan 23 10:52:03 crc kubenswrapper[4957]: I0123 10:52:03.831935 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:03 crc kubenswrapper[4957]: I0123 10:52:03.831979 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:03 crc kubenswrapper[4957]: I0123 10:52:03.831991 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:03 crc kubenswrapper[4957]: I0123 10:52:03.832008 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:03 crc kubenswrapper[4957]: I0123 10:52:03.832020 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:03Z","lastTransitionTime":"2026-01-23T10:52:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:03 crc kubenswrapper[4957]: I0123 10:52:03.935325 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:03 crc kubenswrapper[4957]: I0123 10:52:03.935586 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:03 crc kubenswrapper[4957]: I0123 10:52:03.935673 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:03 crc kubenswrapper[4957]: I0123 10:52:03.935766 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:03 crc kubenswrapper[4957]: I0123 10:52:03.935893 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:03Z","lastTransitionTime":"2026-01-23T10:52:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:04 crc kubenswrapper[4957]: I0123 10:52:04.038229 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9rkq" event={"ID":"340bb9e5-0a20-4377-acf7-aba4b7788153","Type":"ContainerStarted","Data":"0c9bd76958ab96a9f7dea5e750c24613529ccbb94592ebfa6aa172106685be38"} Jan 23 10:52:04 crc kubenswrapper[4957]: I0123 10:52:04.038301 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:04 crc kubenswrapper[4957]: I0123 10:52:04.038335 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:04 crc kubenswrapper[4957]: I0123 10:52:04.038347 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:04 crc kubenswrapper[4957]: I0123 10:52:04.038365 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:04 crc kubenswrapper[4957]: I0123 10:52:04.038376 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:04Z","lastTransitionTime":"2026-01-23T10:52:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:04 crc kubenswrapper[4957]: I0123 10:52:04.141746 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:04 crc kubenswrapper[4957]: I0123 10:52:04.141794 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:04 crc kubenswrapper[4957]: I0123 10:52:04.141812 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:04 crc kubenswrapper[4957]: I0123 10:52:04.141836 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:04 crc kubenswrapper[4957]: I0123 10:52:04.141853 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:04Z","lastTransitionTime":"2026-01-23T10:52:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:04 crc kubenswrapper[4957]: I0123 10:52:04.245235 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:04 crc kubenswrapper[4957]: I0123 10:52:04.245323 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:04 crc kubenswrapper[4957]: I0123 10:52:04.245343 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:04 crc kubenswrapper[4957]: I0123 10:52:04.245406 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:04 crc kubenswrapper[4957]: I0123 10:52:04.245429 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:04Z","lastTransitionTime":"2026-01-23T10:52:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:04 crc kubenswrapper[4957]: I0123 10:52:04.348496 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:04 crc kubenswrapper[4957]: I0123 10:52:04.348558 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:04 crc kubenswrapper[4957]: I0123 10:52:04.348576 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:04 crc kubenswrapper[4957]: I0123 10:52:04.348603 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:04 crc kubenswrapper[4957]: I0123 10:52:04.348623 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:04Z","lastTransitionTime":"2026-01-23T10:52:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:04 crc kubenswrapper[4957]: I0123 10:52:04.451582 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:04 crc kubenswrapper[4957]: I0123 10:52:04.451620 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:04 crc kubenswrapper[4957]: I0123 10:52:04.451630 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:04 crc kubenswrapper[4957]: I0123 10:52:04.451646 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:04 crc kubenswrapper[4957]: I0123 10:52:04.451657 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:04Z","lastTransitionTime":"2026-01-23T10:52:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:04 crc kubenswrapper[4957]: I0123 10:52:04.528081 4957 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-5fxfb"] Jan 23 10:52:04 crc kubenswrapper[4957]: I0123 10:52:04.528676 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5fxfb" Jan 23 10:52:04 crc kubenswrapper[4957]: E0123 10:52:04.528785 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5fxfb" podUID="87775b38-0664-48f6-8857-7568c135bd79" Jan 23 10:52:04 crc kubenswrapper[4957]: I0123 10:52:04.544478 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9f19c40d295c11e3a1170d61fd738b1dcd8fb10087f6a1bb74e6e6c8e6cfb56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6fcda9eaf99f7b60db85da6f18a98ccca7b5bc532aa28388fc7845caf1a7356\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-ident
ity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:04Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:04 crc kubenswrapper[4957]: I0123 10:52:04.553259 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:04 crc kubenswrapper[4957]: I0123 10:52:04.553682 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:04 crc kubenswrapper[4957]: I0123 10:52:04.553759 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:04 crc kubenswrapper[4957]: I0123 10:52:04.553870 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:04 crc kubenswrapper[4957]: I0123 10:52:04.553960 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:04Z","lastTransitionTime":"2026-01-23T10:52:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:04 crc kubenswrapper[4957]: I0123 10:52:04.560308 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:04Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:04 crc kubenswrapper[4957]: I0123 10:52:04.578351 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrj7n\" (UniqueName: \"kubernetes.io/projected/87775b38-0664-48f6-8857-7568c135bd79-kube-api-access-wrj7n\") pod \"network-metrics-daemon-5fxfb\" (UID: \"87775b38-0664-48f6-8857-7568c135bd79\") " pod="openshift-multus/network-metrics-daemon-5fxfb" Jan 23 10:52:04 crc kubenswrapper[4957]: I0123 10:52:04.578529 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/87775b38-0664-48f6-8857-7568c135bd79-metrics-certs\") pod \"network-metrics-daemon-5fxfb\" (UID: \"87775b38-0664-48f6-8857-7568c135bd79\") " pod="openshift-multus/network-metrics-daemon-5fxfb" Jan 23 10:52:04 crc kubenswrapper[4957]: I0123 10:52:04.578605 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6cq2v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11d94cd0-1619-4ef6-952a-aef84e1cdc75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdb19fbdef461009ebd78d9089ba9c94908e4c9fbcab108320e0d89c7f30547f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d28aff8ae7d4adbef5753dc21af0f1055f19ed4404bf9af31f34be215120555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d28aff8ae7d4adbef5753dc21af0f1055f19ed4404bf9af31f34be215120555\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90e2094361e033ef7c8226913e9e396e6a2036617edb9c5ad0ed00c2c5f9f707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90e2094361e033ef7c8226913e9e396e6a2036617edb9c5ad0ed00c2c5f9f707\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e945a247fdc5833508edb04c5130a406d137e3b0ab1be12f9bf954f2621c3bf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e945a247fdc5833508edb04c5130a406d137e3b0ab1be12f9bf954f2621c3bf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26ffeff282bb20d68e9f648999f28b818a67b74f062714fd26ad22e70591a74c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26ffeff282bb20d68e9f648999f28b818a67b74f062714fd26ad22e70591a74c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bbe74c2c27b97766cfa5d26987cd0fbd8289871e66eab78c54a52c0aa039728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bbe74c2c27b97766cfa5d26987cd0fbd8289871e66eab78c54a52c0aa039728\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9ee5b55e77324735662dd6bc0fdeee86af454eb4b9e8eb9e877119f7c1395ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9ee5b55e77324735662dd6bc0fdeee86af454eb4b9e8eb9e877119f7c1395ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6cq2v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:04Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:04 crc kubenswrapper[4957]: I0123 10:52:04.591045 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5fxfb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"87775b38-0664-48f6-8857-7568c135bd79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrj7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrj7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:52:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5fxfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:04Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:04 crc kubenswrapper[4957]: I0123 10:52:04.604404 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea507738-b425-4366-808b-3a47317e66d0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cdf71b1a8491d3a4853fde19a5b1af1eb4697cbf07de482e22a52704ba0470f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52eea4c3c7c3b8898e64dd0eb05c1883ea1c2fa94e7e606f3ab48bbf5aaee8d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f405b6b517d30a201b793965bd82536f496d62b89562cefc7e3a9d9f7829633\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b6915f908509c8609290327ffc2dccf0e5680dc227979285a7ebaca4643cb7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e837e02e63dbe59e7920302c0fb0b5c9165e96ebb684adadb02bacd61633214\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"le observer\\\\nW0123 10:51:48.273886 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0123 10:51:48.273997 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0123 10:51:48.275269 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3502356273/tls.crt::/tmp/serving-cert-3502356273/tls.key\\\\\\\"\\\\nI0123 10:51:48.537137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0123 10:51:48.548481 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0123 10:51:48.548515 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0123 10:51:48.548544 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0123 10:51:48.548577 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0123 10:51:48.561057 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0123 10:51:48.561112 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0123 10:51:48.561123 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0123 10:51:48.561139 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0123 10:51:48.561151 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0123 10:51:48.561158 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0123 10:51:48.561167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0123 10:51:48.561209 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0123 10:51:48.561756 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da77583099215643577c5d064d67ce2cca9d0b74e7ba7c88f3a948a8516fd66c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd61e9841e88c0a34764491cfafe3e4f94233e4ac0543a3e6ed1326dd8d7280d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd61e9841e88c0a34764491cfafe3e4f94233e4ac0543a3e6ed1326dd8d7280d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:04Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:04 crc kubenswrapper[4957]: I0123 10:52:04.616726 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:04Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:04 crc kubenswrapper[4957]: I0123 10:52:04.627945 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://512cd439903792d034cd6017d149d8f3e9e24ffbfc36964572fc9419d54c3513\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:04Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:04 crc kubenswrapper[4957]: I0123 10:52:04.641384 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:04Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:04 crc kubenswrapper[4957]: I0123 10:52:04.656203 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:04 crc kubenswrapper[4957]: I0123 10:52:04.656233 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:04 crc kubenswrapper[4957]: I0123 10:52:04.656242 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:04 crc kubenswrapper[4957]: I0123 10:52:04.656255 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:04 crc kubenswrapper[4957]: I0123 10:52:04.656265 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:04Z","lastTransitionTime":"2026-01-23T10:52:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:04 crc kubenswrapper[4957]: I0123 10:52:04.663020 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87adc28a-89e3-4743-a9f2-098d4a9432d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a14cf2687aa7c7a4c43dffbc2ad99a41aef0e46719171f63c7f769ee2d54e4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca18567eec1b0cc34d911b28d9f3d670a061722086817f58236f6a0da557262\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://7f862c6f11fe904458a8ecde92079c0b4aa4a9cb4dfc6f2ca094a1d3142570d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0efa75cf10a812bc4de5b071048558eb5f48828f6fb3049f3820fe5e0b7e2b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1a90ac89ce8ac710e5f8cff26e69aff44a735ef8155a7e93324809904a33e01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26b4bdc4f2514902dc2c95df59af4c954a5c1905821f5981e9437ff54d6d544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d660c0e57541cf895c30cdd702c50a43e189e0ed85fe800f422cc1646cbb57e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d87256dede5212cdcd07157c02087d16643332173910f7686c8b8fcaff9ed938\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-23T10:52:00Z\\\",\\\"message\\\":\\\"ssqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0123 10:52:00.255642 6262 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0123 10:52:00.256029 6262 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0123 10:52:00.256067 6262 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0123 10:52:00.256090 6262 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0123 10:52:00.256112 6262 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0123 10:52:00.256120 6262 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0123 10:52:00.256141 6262 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0123 10:52:00.256177 6262 factory.go:656] Stopping watch factory\\\\nI0123 10:52:00.256195 6262 ovnkube.go:599] Stopped ovnkube\\\\nI0123 10:52:00.256233 6262 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0123 10:52:00.256256 6262 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0123 10:52:00.256271 6262 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0123 10:52:00.256300 6262 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0123 10:52:00.256331 6262 handler.go:208] Removed *v1.Pod event handler 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:57Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d660c0e57541cf895c30cdd702c50a43e189e0ed85fe800f422cc1646cbb57e5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-23T10:52:02Z\\\",\\\"message\\\":\\\"ft/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0123 10:52:02.002619 6388 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0123 10:52:02.002696 6388 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0123 10:52:02.003246 6388 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0123 10:52:02.003275 6388 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0123 10:52:02.003331 6388 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0123 10:52:02.003358 6388 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0123 10:52:02.003364 6388 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0123 10:52:02.003360 6388 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0123 10:52:02.003378 6388 factory.go:656] Stopping watch factory\\\\nI0123 10:52:02.003385 6388 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0123 10:52:02.003395 6388 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0123 10:52:02.003406 6388 handler.go:208] Removed *v1.Namespace 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-23T10:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1be5b459e3fae28da165ef0ee506ec5ccd39026d7b7e7c35a3f242c65d60d0ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3de3a4f166173fb3f0ff7428b1d1ddf43a9988749fc445ecff4a7527bd1f3a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20
99482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3de3a4f166173fb3f0ff7428b1d1ddf43a9988749fc445ecff4a7527bd1f3a4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z8hcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:04Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:04 crc kubenswrapper[4957]: I0123 10:52:04.676390 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9rkq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"340bb9e5-0a20-4377-acf7-aba4b7788153\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55l78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55l78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:52:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r9rkq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:04Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:04 crc kubenswrapper[4957]: I0123 10:52:04.679186 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 23 10:52:04 crc kubenswrapper[4957]: I0123 10:52:04.679318 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrj7n\" (UniqueName: \"kubernetes.io/projected/87775b38-0664-48f6-8857-7568c135bd79-kube-api-access-wrj7n\") pod \"network-metrics-daemon-5fxfb\" (UID: \"87775b38-0664-48f6-8857-7568c135bd79\") " pod="openshift-multus/network-metrics-daemon-5fxfb" Jan 23 10:52:04 crc kubenswrapper[4957]: I0123 10:52:04.679344 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/87775b38-0664-48f6-8857-7568c135bd79-metrics-certs\") pod \"network-metrics-daemon-5fxfb\" (UID: \"87775b38-0664-48f6-8857-7568c135bd79\") " pod="openshift-multus/network-metrics-daemon-5fxfb" Jan 23 10:52:04 crc kubenswrapper[4957]: E0123 10:52:04.679408 4957 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-23 10:52:20.679379403 +0000 UTC m=+50.216632130 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 23 10:52:04 crc kubenswrapper[4957]: E0123 10:52:04.679436 4957 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 23 10:52:04 crc kubenswrapper[4957]: E0123 10:52:04.679493 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/87775b38-0664-48f6-8857-7568c135bd79-metrics-certs podName:87775b38-0664-48f6-8857-7568c135bd79 nodeName:}" failed. No retries permitted until 2026-01-23 10:52:05.179476696 +0000 UTC m=+34.716729383 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/87775b38-0664-48f6-8857-7568c135bd79-metrics-certs") pod "network-metrics-daemon-5fxfb" (UID: "87775b38-0664-48f6-8857-7568c135bd79") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 23 10:52:04 crc kubenswrapper[4957]: I0123 10:52:04.679512 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 23 10:52:04 crc kubenswrapper[4957]: I0123 10:52:04.679541 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 23 10:52:04 crc kubenswrapper[4957]: I0123 10:52:04.679560 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 23 10:52:04 crc kubenswrapper[4957]: I0123 10:52:04.679582 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 23 10:52:04 crc kubenswrapper[4957]: E0123 10:52:04.679611 4957 secret.go:188] Couldn't get secret 
openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 23 10:52:04 crc kubenswrapper[4957]: E0123 10:52:04.679648 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-23 10:52:20.67963931 +0000 UTC m=+50.216891997 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 23 10:52:04 crc kubenswrapper[4957]: E0123 10:52:04.679662 4957 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 23 10:52:04 crc kubenswrapper[4957]: E0123 10:52:04.679673 4957 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 23 10:52:04 crc kubenswrapper[4957]: E0123 10:52:04.679683 4957 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 23 10:52:04 crc kubenswrapper[4957]: E0123 10:52:04.679699 4957 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 23 10:52:04 crc kubenswrapper[4957]: E0123 10:52:04.679709 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-23 10:52:20.679702222 +0000 UTC m=+50.216955029 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 23 10:52:04 crc kubenswrapper[4957]: E0123 10:52:04.679712 4957 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 23 10:52:04 crc kubenswrapper[4957]: E0123 10:52:04.679729 4957 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 23 10:52:04 crc kubenswrapper[4957]: E0123 10:52:04.679744 4957 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 23 10:52:04 crc kubenswrapper[4957]: E0123 10:52:04.679759 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-23 10:52:20.679750663 +0000 UTC m=+50.217003490 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 23 10:52:04 crc kubenswrapper[4957]: E0123 10:52:04.679777 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-23 10:52:20.679769023 +0000 UTC m=+50.217021860 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 23 10:52:04 crc kubenswrapper[4957]: I0123 10:52:04.700656 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrj7n\" (UniqueName: \"kubernetes.io/projected/87775b38-0664-48f6-8857-7568c135bd79-kube-api-access-wrj7n\") pod \"network-metrics-daemon-5fxfb\" (UID: \"87775b38-0664-48f6-8857-7568c135bd79\") " pod="openshift-multus/network-metrics-daemon-5fxfb" Jan 23 10:52:04 crc kubenswrapper[4957]: I0123 10:52:04.702954 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb53662e-fe72-4c19-b3a6-f5b541e5afcb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://829cfdb541d2a7861316957f39b8b9f43ec6f9f4e309a491f4451b1f3c34a9a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d9f270c80ebedc7d79510e2f421e23789483dce954f5e1469469703660febf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90926087d1bb350c991fa9425706fcc22e12eec003aba87b72758892aae9d3a8\\\",\\\
"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8144556693b41dc2f9121be49ceed161caf8db5eec797f086128a2016be8072\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:04Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:04 crc kubenswrapper[4957]: I0123 10:52:04.718015 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rg9hb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a6ddd9-627a-4faa-a4c4-096ea19af31d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f469de9a3c43ade33ae855757f1244dcd825827dea9633af7143c078b08d6d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wngq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rg9hb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:04Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:04 crc kubenswrapper[4957]: I0123 10:52:04.731196 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2xjv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"224e3211-1f68-4673-8975-7e71b1e513d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dd046581d049e9ca0071a010da143a9b28d271b533b9cdc1c94d19311be0320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s26mf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f355e8990ff693448c7b8df392b7b2caeb59d6fee6cf8d5d4200f8ce1b5e03ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s26mf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w2xjv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:04Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:04 crc kubenswrapper[4957]: I0123 10:52:04.732215 4957 certificate_manager.go:356] 
kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-08 10:11:35.137592733 +0000 UTC Jan 23 10:52:04 crc kubenswrapper[4957]: I0123 10:52:04.746277 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tlz2g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"233fdd78-4010-4fe8-9068-ee47d8ff25d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6851e0ec1040550b8c9edb1b85213d2c849e381fae6b0f09c9a7247bd9c5088\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpwrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\
\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tlz2g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:04Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:04 crc kubenswrapper[4957]: I0123 10:52:04.758677 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:04 crc kubenswrapper[4957]: I0123 10:52:04.758710 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:04 crc kubenswrapper[4957]: I0123 10:52:04.758721 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:04 crc kubenswrapper[4957]: I0123 10:52:04.758736 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:04 crc kubenswrapper[4957]: I0123 10:52:04.758746 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:04Z","lastTransitionTime":"2026-01-23T10:52:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:04 crc kubenswrapper[4957]: I0123 10:52:04.771630 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 23 10:52:04 crc kubenswrapper[4957]: E0123 10:52:04.771734 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 23 10:52:04 crc kubenswrapper[4957]: I0123 10:52:04.771997 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 23 10:52:04 crc kubenswrapper[4957]: E0123 10:52:04.772040 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 23 10:52:04 crc kubenswrapper[4957]: I0123 10:52:04.772076 4957 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 23 10:52:04 crc kubenswrapper[4957]: E0123 10:52:04.772111 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 23 10:52:04 crc kubenswrapper[4957]: I0123 10:52:04.772800 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8631604-63ce-40b0-b27e-fba17f940f20\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://523b9a208f414955faffe254957d3bb6d287eab26ea653e23c9bcc2c3182d5cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80645d17b02b24a907b20d376fcb65a794768d4c9cf07550bff63d50a011836d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3fcfc9fcf5f37f32b4a654710f3f0f5c3fab5b0b5c35239e5f1a2789d1ec480\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b48e3593322a778bf8d56e3509d97a341f1fee5e172f8ba4bbc4c1dacefb3930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://193091ca5d5fb974b1e2da289e7fbc6e2d3a292e79d1936c7ba10266a5ba9779\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7feaf0932d8842db7af2f28984a811c16674b50ba28a4627829ce8471543daa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7feaf0932d8842db7af2f28984a811c16674b50ba28a4627829ce8471543daa9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://370f9a1676bfe308ad9883454af09a802df678db77e0e246de3cc7aea91410a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://370f9a1676bfe308ad9883454af09a802df678db77e0e246de3cc7aea91410a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://78b38484db8444b196494249f32bd63f2478d80e89ff823c8ac7c974b1ea6e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78b38484db8444b196494249f32bd63f2478d80e89ff823c8ac7c974b1ea6e76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:04Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:04 crc kubenswrapper[4957]: I0123 10:52:04.792881 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b07c6571fe0e39bd6607feb900919a481ef8a36483c9b4de1c6d5ea3453ba61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:04Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:04 crc kubenswrapper[4957]: I0123 10:52:04.801719 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fnxz6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c7b1449-2e9b-4c07-a531-591cb968f511\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6410b47b5b38b4ce50175e8cd9c2cc7ca241b914d1dba4accf3a1deb3e066ccd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj687\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:49Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fnxz6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:04Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:04 crc kubenswrapper[4957]: I0123 10:52:04.861215 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:04 crc kubenswrapper[4957]: I0123 10:52:04.861261 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:04 crc kubenswrapper[4957]: I0123 10:52:04.861272 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:04 crc kubenswrapper[4957]: I0123 10:52:04.861321 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:04 crc kubenswrapper[4957]: I0123 10:52:04.861333 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:04Z","lastTransitionTime":"2026-01-23T10:52:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:04 crc kubenswrapper[4957]: I0123 10:52:04.963925 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:04 crc kubenswrapper[4957]: I0123 10:52:04.963983 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:04 crc kubenswrapper[4957]: I0123 10:52:04.964003 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:04 crc kubenswrapper[4957]: I0123 10:52:04.964025 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:04 crc kubenswrapper[4957]: I0123 10:52:04.964042 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:04Z","lastTransitionTime":"2026-01-23T10:52:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:05 crc kubenswrapper[4957]: I0123 10:52:05.044060 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9rkq" event={"ID":"340bb9e5-0a20-4377-acf7-aba4b7788153","Type":"ContainerStarted","Data":"19e8778b10a2bb2a713374cb69e07a76edf2371f7a130a691e759a03c0322251"} Jan 23 10:52:05 crc kubenswrapper[4957]: I0123 10:52:05.044102 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9rkq" event={"ID":"340bb9e5-0a20-4377-acf7-aba4b7788153","Type":"ContainerStarted","Data":"ac26363f220465a8681578f7a7b90cf7d0abf8676379a1e963f4998646327c95"} Jan 23 10:52:05 crc kubenswrapper[4957]: I0123 10:52:05.047999 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z8hcw_87adc28a-89e3-4743-a9f2-098d4a9432d8/ovnkube-controller/1.log" Jan 23 10:52:05 crc kubenswrapper[4957]: I0123 10:52:05.066970 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:05 crc kubenswrapper[4957]: I0123 10:52:05.066995 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:05 crc kubenswrapper[4957]: I0123 10:52:05.067004 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:05 crc kubenswrapper[4957]: I0123 10:52:05.067017 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:05 crc kubenswrapper[4957]: I0123 10:52:05.067027 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:05Z","lastTransitionTime":"2026-01-23T10:52:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:05 crc kubenswrapper[4957]: I0123 10:52:05.078976 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8631604-63ce-40b0-b27e-fba17f940f20\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://523b9a208f414955faffe254957d3bb6d287eab26ea653e23c9bcc2c3182d5cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80645d17b02b24a907b20d376fcb65a794768d4c9cf07550bff63d50a011836d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3fcfc9fcf5f37f32b4a654710f3f0f5c3fab5b0b5c35239e5f1a2789d1ec480\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b48e3593322a778bf8d56e3509d97a341f1fee5e172f8ba4bbc4c1dacefb3930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://193091ca5d5fb974b1e2da289e7fbc6e2d3a292e79d1936c7ba10266a5ba9779\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7feaf0932d8842db7af2f28984a811c16674b50ba28a4627829ce8471543daa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7feaf0932d8842db7af2f28984a811c16674b50ba28a4627829ce8471543daa9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://370f9a1676bfe308ad9883454af09a802df678db77e0e246de3cc7aea91410a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://370f9a1676bfe308ad9883454af09a802df678db77e0e246de3cc7aea91410a6\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://78b38484db8444b196494249f32bd63f2478d80e89ff823c8ac7c974b1ea6e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78b38484db8444b196494249f32bd63f2478d80e89ff823c8ac7c974b1ea6e76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:05Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:05 crc kubenswrapper[4957]: I0123 10:52:05.092516 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b07c6571fe0e39bd6607feb900919a481ef8a36483c9b4de1c6d5ea3453ba61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:05Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:05 crc kubenswrapper[4957]: I0123 10:52:05.103706 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fnxz6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c7b1449-2e9b-4c07-a531-591cb968f511\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6410b47b5b38b4ce50175e8cd9c2cc7ca241b914d1dba4accf3a1deb3e066ccd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj687\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:49Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fnxz6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:05Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:05 crc kubenswrapper[4957]: I0123 10:52:05.121788 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:05Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:05 crc kubenswrapper[4957]: I0123 10:52:05.141377 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9f19c40d295c11e3a1170d61fd738b1dcd8fb10087f6a1bb74e6e6c8e6cfb56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6fcda9eaf99f7b60db85da6f18a98ccca7b5bc532aa28388fc7845caf1a7356\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:05Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:05 crc kubenswrapper[4957]: I0123 10:52:05.160821 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:05Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:05 crc kubenswrapper[4957]: I0123 10:52:05.170129 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:05 crc kubenswrapper[4957]: I0123 10:52:05.170190 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:05 crc kubenswrapper[4957]: I0123 10:52:05.170207 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:05 crc kubenswrapper[4957]: I0123 10:52:05.170233 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:05 crc kubenswrapper[4957]: I0123 10:52:05.170253 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:05Z","lastTransitionTime":"2026-01-23T10:52:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:05 crc kubenswrapper[4957]: I0123 10:52:05.183592 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/87775b38-0664-48f6-8857-7568c135bd79-metrics-certs\") pod \"network-metrics-daemon-5fxfb\" (UID: \"87775b38-0664-48f6-8857-7568c135bd79\") " pod="openshift-multus/network-metrics-daemon-5fxfb" Jan 23 10:52:05 crc kubenswrapper[4957]: E0123 10:52:05.183777 4957 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 23 10:52:05 crc kubenswrapper[4957]: E0123 10:52:05.183882 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/87775b38-0664-48f6-8857-7568c135bd79-metrics-certs podName:87775b38-0664-48f6-8857-7568c135bd79 nodeName:}" failed. No retries permitted until 2026-01-23 10:52:06.183853295 +0000 UTC m=+35.721106022 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/87775b38-0664-48f6-8857-7568c135bd79-metrics-certs") pod "network-metrics-daemon-5fxfb" (UID: "87775b38-0664-48f6-8857-7568c135bd79") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 23 10:52:05 crc kubenswrapper[4957]: I0123 10:52:05.183852 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6cq2v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11d94cd0-1619-4ef6-952a-aef84e1cdc75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdb19fbdef461009ebd78d9089ba9c94908e4c9fbcab108320e0d89c7f30547f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d28aff8ae7d4adbef5753dc21af0f1055f19ed4404bf9af31f34be215120555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d28aff8ae7d4adbef5753dc21af0f1055f19ed4404bf9af31f34be215120555\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90e2094361e033ef7c8226913e9e396e6a2036617edb9c5ad0ed00c2c5f9f707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90e2094361e033ef7c8226913e9e396e6a2036617edb9c5ad0ed00c2c5f9f707\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e945a247fdc5833508edb04c5130a406d137e3b0ab1be12f9bf954f2621c3bf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e945a247fdc5833508edb04c5130a406d137e3b0ab1be12f9bf954f2621c3bf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26ffeff282bb20d68e9f648999f28b818a67b74f062714fd26ad22e70591a74c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26ffeff282bb20d68e9f648999f28b818a67b74f062714fd26ad22e70591a74c
\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bbe74c2c27b97766cfa5d26987cd0fbd8289871e66eab78c54a52c0aa039728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bbe74c2c27b97766cfa5d26987cd0fbd8289871e66eab78c54a52c0aa039728\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9ee5b55e77324735662dd6bc0fdeee86af454eb4b9e8eb9e877119f7c1395ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9ee5b55e77324735662dd6bc0fdeee86af454eb4b9e8eb9e877119f7c1395ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6cq2v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:05Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:05 crc kubenswrapper[4957]: I0123 10:52:05.202848 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5fxfb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87775b38-0664-48f6-8857-7568c135bd79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrj7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrj7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:52:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5fxfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:05Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:05 crc kubenswrapper[4957]: I0123 10:52:05.227445 4957 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea507738-b425-4366-808b-3a47317e66d0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cdf71b1a8491d3a4853fde19a5b1af1eb4697cbf07de482e22a52704ba0470f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52eea4c3c7c3b8898e64dd0eb05c1883ea1c2fa94e7e606f3ab48bbf5aaee8d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f405b6b517d30a201b793965bd82536f496d62b89562cefc7e3a9d9f7829633\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc
/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b6915f908509c8609290327ffc2dccf0e5680dc227979285a7ebaca4643cb7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e837e02e63dbe59e7920302c0fb0b5c9165e96ebb684adadb02bacd61633214\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"le observer\\\\nW0123 10:51:48.273886 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0123 10:51:48.273997 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0123 10:51:48.275269 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3502356273/tls.crt::/tmp/serving-cert-3502356273/tls.key\\\\\\\"\\\\nI0123 10:51:48.537137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0123 10:51:48.548481 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0123 10:51:48.548515 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0123 10:51:48.548544 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0123 10:51:48.548577 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0123 10:51:48.561057 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0123 10:51:48.561112 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0123 10:51:48.561123 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0123 10:51:48.561139 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0123 10:51:48.561151 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0123 10:51:48.561158 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0123 10:51:48.561167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0123 10:51:48.561209 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0123 10:51:48.561756 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da77583099215643577c5d064d67ce2cca9d0b74e7ba7c88f3a948a8516fd66c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd61e9841e88c0a34764491cfafe3e4f94233e4ac0543a3e6ed1326dd8d7280d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd61e9841e88c0a34764491cfafe3e4f94233e4ac0543a3e6ed1326dd8d7280d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:05Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:05 crc kubenswrapper[4957]: I0123 10:52:05.255137 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://512cd439903792d034cd6017d149d8f3e9e24ffbfc36964572fc9419d54c3513\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:05Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:05 crc kubenswrapper[4957]: I0123 10:52:05.273084 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:05Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:05 crc kubenswrapper[4957]: I0123 10:52:05.273174 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:05 crc kubenswrapper[4957]: I0123 10:52:05.273217 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:05 crc kubenswrapper[4957]: I0123 10:52:05.273229 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:05 crc kubenswrapper[4957]: I0123 10:52:05.273250 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:05 crc kubenswrapper[4957]: I0123 10:52:05.273265 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:05Z","lastTransitionTime":"2026-01-23T10:52:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:05 crc kubenswrapper[4957]: I0123 10:52:05.288483 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tlz2g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"233fdd78-4010-4fe8-9068-ee47d8ff25d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6851e0ec1040550b8c9edb1b85213d2c849e381fae6b0f09c9a7247bd9c5088\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpwrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tlz2g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:05Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:05 crc kubenswrapper[4957]: I0123 10:52:05.311621 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87adc28a-89e3-4743-a9f2-098d4a9432d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a14cf2687aa7c7a4c43dffbc2ad99a41aef0e46719171f63c7f769ee2d54e4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca18567eec1b0cc34d911b28d9f3d670a061722086817f58236f6a0da557262\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mou
ntPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f862c6f11fe904458a8ecde92079c0b4aa4a9cb4dfc6f2ca094a1d3142570d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0efa75cf10a812bc4de5b071048558eb5f48828f6fb3049f3820fe5e0b7e2b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1a90ac89ce8ac710e5f8cff26e69aff44a735ef8155a7e93324809904a33e01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\
\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26b4bdc4f2514902dc2c95df59af4c954a5c1905821f5981e9437ff54d6d544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d660c0e57541cf895c30cdd702c50a43e189e0ed85fe800f422cc1646cbb57e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d87256dede5212cdcd07157c02087d16643332173910f7686c8b8fcaff9ed938\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-23T10:52:00Z\\\",\\\"message\\\":\\\"ssqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0123 10:52:00.255642 6262 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0123 10:52:00.256029 6262 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0123 10:52:00.256067 6262 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0123 10:52:00.256090 6262 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0123 10:52:00.256112 6262 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0123 10:52:00.256120 6262 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0123 10:52:00.256141 6262 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0123 10:52:00.256177 6262 factory.go:656] Stopping watch factory\\\\nI0123 10:52:00.256195 6262 ovnkube.go:599] Stopped ovnkube\\\\nI0123 10:52:00.256233 6262 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0123 10:52:00.256256 6262 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0123 10:52:00.256271 6262 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0123 10:52:00.256300 6262 handler.go:208] Removed 
*v1.Namespace event handler 5\\\\nI0123 10:52:00.256331 6262 handler.go:208] Removed *v1.Pod event handler \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:57Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d660c0e57541cf895c30cdd702c50a43e189e0ed85fe800f422cc1646cbb57e5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-23T10:52:02Z\\\",\\\"message\\\":\\\"ft/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0123 10:52:02.002619 6388 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0123 10:52:02.002696 6388 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0123 10:52:02.003246 6388 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0123 10:52:02.003275 6388 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0123 10:52:02.003331 6388 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0123 10:52:02.003358 6388 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0123 10:52:02.003364 6388 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0123 10:52:02.003360 6388 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0123 10:52:02.003378 6388 factory.go:656] Stopping watch factory\\\\nI0123 10:52:02.003385 6388 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0123 10:52:02.003395 6388 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0123 10:52:02.003406 6388 handler.go:208] Removed *v1.Namespace 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-23T10:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1be5b459e3fae28da165ef0ee506ec5ccd39026d7b7e7c35a3f242c65d60d0ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3de3a4f166173fb3f0ff7428b1d1ddf43a9988749fc445ecff4a7527bd1f3a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20
99482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3de3a4f166173fb3f0ff7428b1d1ddf43a9988749fc445ecff4a7527bd1f3a4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z8hcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:05Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:05 crc kubenswrapper[4957]: I0123 10:52:05.327414 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9rkq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"340bb9e5-0a20-4377-acf7-aba4b7788153\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac26363f220465a8681578f7a7b90cf7d0abf8676379a1e963f4998646327c95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:52:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55l78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19e8778b10a2bb2a713374cb69e07a76edf2371f7a130a691e759a03c0322251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:52:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55l78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:52:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r9rkq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:05Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:05 crc kubenswrapper[4957]: I0123 10:52:05.345147 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb53662e-fe72-4c19-b3a6-f5b541e5afcb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://829cfdb541d2a7861316957f39b8b9f43ec6f9f4e309a491f4451b1f3c34a9a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d9f270c80ebedc7d79510e2f421e23789483dce954f5e1469469703660febf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90926087d1bb350c991fa9425706fcc22e12eec003aba87b72758892aae9d3a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8144556693b41dc2f9121be49ceed161caf8db5eec797f086128a2016be8072\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:05Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:05 crc kubenswrapper[4957]: I0123 10:52:05.357847 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rg9hb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a6ddd9-627a-4faa-a4c4-096ea19af31d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f469de9a3c43ade33ae855757f1244dcd825827dea9633af7143c078b08d6d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wngq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase
\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rg9hb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:05Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:05 crc kubenswrapper[4957]: I0123 10:52:05.372041 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2xjv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"224e3211-1f68-4673-8975-7e71b1e513d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dd046581d049e9ca0071a010da143a9b28d271b533b9cdc1c94d19311be0320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s26mf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f355e8990ff693448c7b8df392b7b2caeb59d6fee6cf8d5d4200f8ce1b5e03ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s26mf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"host
IP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w2xjv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:05Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:05 crc kubenswrapper[4957]: I0123 10:52:05.376549 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:05 crc kubenswrapper[4957]: I0123 10:52:05.376628 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:05 crc kubenswrapper[4957]: I0123 10:52:05.376657 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:05 crc kubenswrapper[4957]: I0123 10:52:05.376688 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:05 crc kubenswrapper[4957]: I0123 10:52:05.376706 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:05Z","lastTransitionTime":"2026-01-23T10:52:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:05 crc kubenswrapper[4957]: I0123 10:52:05.478906 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:05 crc kubenswrapper[4957]: I0123 10:52:05.478983 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:05 crc kubenswrapper[4957]: I0123 10:52:05.479004 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:05 crc kubenswrapper[4957]: I0123 10:52:05.479031 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:05 crc kubenswrapper[4957]: I0123 10:52:05.479055 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:05Z","lastTransitionTime":"2026-01-23T10:52:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:05 crc kubenswrapper[4957]: I0123 10:52:05.582342 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:05 crc kubenswrapper[4957]: I0123 10:52:05.582415 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:05 crc kubenswrapper[4957]: I0123 10:52:05.582439 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:05 crc kubenswrapper[4957]: I0123 10:52:05.582471 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:05 crc kubenswrapper[4957]: I0123 10:52:05.582493 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:05Z","lastTransitionTime":"2026-01-23T10:52:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:05 crc kubenswrapper[4957]: I0123 10:52:05.684816 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:05 crc kubenswrapper[4957]: I0123 10:52:05.684858 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:05 crc kubenswrapper[4957]: I0123 10:52:05.684871 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:05 crc kubenswrapper[4957]: I0123 10:52:05.684888 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:05 crc kubenswrapper[4957]: I0123 10:52:05.684901 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:05Z","lastTransitionTime":"2026-01-23T10:52:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:05 crc kubenswrapper[4957]: I0123 10:52:05.732818 4957 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-10 18:43:31.270558861 +0000 UTC Jan 23 10:52:05 crc kubenswrapper[4957]: I0123 10:52:05.788247 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:05 crc kubenswrapper[4957]: I0123 10:52:05.788366 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:05 crc kubenswrapper[4957]: I0123 10:52:05.788391 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:05 crc kubenswrapper[4957]: I0123 10:52:05.788418 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:05 crc kubenswrapper[4957]: I0123 10:52:05.788436 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:05Z","lastTransitionTime":"2026-01-23T10:52:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:05 crc kubenswrapper[4957]: I0123 10:52:05.798862 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 23 10:52:05 crc kubenswrapper[4957]: I0123 10:52:05.816601 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb53662e-fe72-4c19-b3a6-f5b541e5afcb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://829cfdb541d2a7861316957f39b8b9f43ec6f9f4e309a491f4451b1f3c34a9a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d9f270c80ebedc7d79510e2f421e23789483dce954f5e1469469703660febf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90926087d1bb350c991fa9425706fcc22e12eec003aba87b72758892aae9d3a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8144556693b41dc2f9121be49ceed161caf8db5eec797f086128a2016be8072\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:05Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:05 crc kubenswrapper[4957]: I0123 10:52:05.830026 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rg9hb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a6ddd9-627a-4faa-a4c4-096ea19af31d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f469de9a3c43ade33ae855757f1244dcd825827dea9633af7143c078b08d6d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wngq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase
\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rg9hb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:05Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:05 crc kubenswrapper[4957]: I0123 10:52:05.845435 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2xjv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"224e3211-1f68-4673-8975-7e71b1e513d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dd046581d049e9ca0071a010da143a9b28d271b533b9cdc1c94d19311be0320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s26mf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f355e8990ff693448c7b8df392b7b2caeb59d6fee6cf8d5d4200f8ce1b5e03ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s26mf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"host
IP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w2xjv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:05Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:05 crc kubenswrapper[4957]: I0123 10:52:05.863264 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tlz2g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"233fdd78-4010-4fe8-9068-ee47d8ff25d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6851e0ec1040550b8c9edb1b85213d2c849e381fae6b0f09c9a7247bd9c5088\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"
name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpwrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tlz2g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:05Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:05 crc kubenswrapper[4957]: I0123 10:52:05.891540 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:05 crc kubenswrapper[4957]: I0123 10:52:05.891589 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:05 crc kubenswrapper[4957]: I0123 10:52:05.891605 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:05 crc kubenswrapper[4957]: I0123 10:52:05.891633 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:05 crc kubenswrapper[4957]: I0123 10:52:05.891650 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:05Z","lastTransitionTime":"2026-01-23T10:52:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:05 crc kubenswrapper[4957]: I0123 10:52:05.900171 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87adc28a-89e3-4743-a9f2-098d4a9432d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a14cf2687aa7c7a4c43dffbc2ad99a41aef0e46719171f63c7f769ee2d54e4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca18567eec1b0cc34d911b28d9f3d670a061722086817f58236f6a0da557262\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://7f862c6f11fe904458a8ecde92079c0b4aa4a9cb4dfc6f2ca094a1d3142570d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0efa75cf10a812bc4de5b071048558eb5f48828f6fb3049f3820fe5e0b7e2b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1a90ac89ce8ac710e5f8cff26e69aff44a735ef8155a7e93324809904a33e01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26b4bdc4f2514902dc2c95df59af4c954a5c1905821f5981e9437ff54d6d544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d660c0e57541cf895c30cdd702c50a43e189e0ed85fe800f422cc1646cbb57e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d87256dede5212cdcd07157c02087d16643332173910f7686c8b8fcaff9ed938\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-23T10:52:00Z\\\",\\\"message\\\":\\\"ssqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0123 10:52:00.255642 6262 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0123 10:52:00.256029 6262 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0123 10:52:00.256067 6262 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0123 10:52:00.256090 6262 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0123 10:52:00.256112 6262 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0123 10:52:00.256120 6262 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0123 10:52:00.256141 6262 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0123 10:52:00.256177 6262 factory.go:656] Stopping watch factory\\\\nI0123 10:52:00.256195 6262 ovnkube.go:599] Stopped ovnkube\\\\nI0123 10:52:00.256233 6262 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0123 10:52:00.256256 6262 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0123 10:52:00.256271 6262 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0123 10:52:00.256300 6262 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0123 10:52:00.256331 6262 handler.go:208] Removed *v1.Pod event handler 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:57Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d660c0e57541cf895c30cdd702c50a43e189e0ed85fe800f422cc1646cbb57e5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-23T10:52:02Z\\\",\\\"message\\\":\\\"ft/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0123 10:52:02.002619 6388 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0123 10:52:02.002696 6388 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0123 10:52:02.003246 6388 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0123 10:52:02.003275 6388 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0123 10:52:02.003331 6388 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0123 10:52:02.003358 6388 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0123 10:52:02.003364 6388 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0123 10:52:02.003360 6388 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0123 10:52:02.003378 6388 factory.go:656] Stopping watch factory\\\\nI0123 10:52:02.003385 6388 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0123 10:52:02.003395 6388 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0123 10:52:02.003406 6388 handler.go:208] Removed *v1.Namespace 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-23T10:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1be5b459e3fae28da165ef0ee506ec5ccd39026d7b7e7c35a3f242c65d60d0ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3de3a4f166173fb3f0ff7428b1d1ddf43a9988749fc445ecff4a7527bd1f3a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20
99482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3de3a4f166173fb3f0ff7428b1d1ddf43a9988749fc445ecff4a7527bd1f3a4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z8hcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:05Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:05 crc kubenswrapper[4957]: I0123 10:52:05.914993 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9rkq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"340bb9e5-0a20-4377-acf7-aba4b7788153\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac26363f220465a8681578f7a7b90cf7d0abf8676379a1e963f4998646327c95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:52:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55l78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19e8778b10a2bb2a713374cb69e07a76edf2371f7a130a691e759a03c0322251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:52:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55l78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:52:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r9rkq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:05Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:05 crc kubenswrapper[4957]: I0123 10:52:05.947606 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8631604-63ce-40b0-b27e-fba17f940f20\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://523b9a208f414955faffe254957d3bb6d287eab26ea653e23c9bcc2c3182d5cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80645d17b02b24a907b20d376fcb65a794768d4c9cf07550bff63d50a011836d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3fcfc9fcf5f37f32b4a654710f3f0f5c3fab5b0b5c35239e5f1a2789d1ec480\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b48e3593322a778bf8d56e3509d97a341f1fee5
e172f8ba4bbc4c1dacefb3930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://193091ca5d5fb974b1e2da289e7fbc6e2d3a292e79d1936c7ba10266a5ba9779\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7feaf0932d8842db7af2f28984a811c16674b50ba28a4627829ce8471543daa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7feaf0932d8842db7af2f28984a811c16674b50ba28a4627829ce8471543daa9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://370f9a1676bfe308ad9883454af09a802df678db77e0e246de3cc7aea91410a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://370f9a1676bfe308ad9883454af09a802df678db77e0e246de3cc7aea91410a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://78b38484db8444b196494249f32bd63f2478d80e89ff823c8ac7c974b1ea6e76\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78b38484db8444b196494249f32bd63f2478d80e89ff823c8ac7c974b1ea6e76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:05Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:05 crc kubenswrapper[4957]: I0123 10:52:05.963646 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b07c6571fe0e39bd6607feb900919a481ef8a36483c9b4de1c6d5ea3453ba61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:05Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:05 crc kubenswrapper[4957]: I0123 10:52:05.976862 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fnxz6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c7b1449-2e9b-4c07-a531-591cb968f511\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6410b47b5b38b4ce50175e8cd9c2cc7ca241b914d1dba4accf3a1deb3e066ccd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj687\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:49Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fnxz6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:05Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:05 crc kubenswrapper[4957]: I0123 10:52:05.990928 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6cq2v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11d94cd0-1619-4ef6-952a-aef84e1cdc75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdb19fbdef461009ebd78d9089ba9c94908e4c9fbcab108320e0d89c7f30547f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d28aff8ae7d4adbef5753dc21af0f1055f19ed4404bf9af31f34be215120555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d28aff8ae7d4adbef5753dc21af0f1055f19ed4404bf9af31f34be215120555\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90e2094361e033ef7c8226913e9e396e6a2036617edb9c5ad0ed00c2c5f9f707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90e2094361e033ef7c8226913e9e396e6a2036617edb9c5ad0ed00c2c5f9f707\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e945a247fdc5833508edb04c5130a406d137e3b0ab1be12f9bf954f2621c3bf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e945a247fdc5833508edb04c5130a406d137e3b0ab1be12f9bf954f2621c3bf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26ffeff282bb20d68e9f648999f28b818a67b74f062714fd26ad22e70591a74c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26ffeff282bb20d68e9f648999f28b818a67b74f062714fd26ad22e70591a74c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bbe74c2c27b97766cfa5d26987cd0fbd8289871e66eab78c54a52c0aa039728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bbe74c2c27b97766cfa5d26987cd0fbd8289871e66eab78c54a52c0aa039728\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9ee5b55e77324735662dd6bc0fdeee86af454eb4b9e8eb9e877119f7c1395ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9ee5b55e77324735662dd6bc0fdeee86af454eb4b9e8eb9e877119f7c1395ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6cq2v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:05Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:05 crc kubenswrapper[4957]: I0123 10:52:05.993459 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:05 crc kubenswrapper[4957]: I0123 10:52:05.993492 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:05 crc 
kubenswrapper[4957]: I0123 10:52:05.993504 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:05 crc kubenswrapper[4957]: I0123 10:52:05.993520 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:05 crc kubenswrapper[4957]: I0123 10:52:05.993531 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:05Z","lastTransitionTime":"2026-01-23T10:52:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:06 crc kubenswrapper[4957]: I0123 10:52:06.003877 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5fxfb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87775b38-0664-48f6-8857-7568c135bd79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrj7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrj7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:52:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5fxfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:06Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:06 crc kubenswrapper[4957]: I0123 10:52:06.019164 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea507738-b425-4366-808b-3a47317e66d0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cdf71b1a8491d3a4853fde19a5b1af1eb4697cbf07de482e22a52704ba0470f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52eea4c3c7c3b8898e64dd0eb05c1883ea1c2fa94e7e606f3ab48bbf5aaee8d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f405b6b517d30a201b793965bd82536f496d62b89562cefc7e3a9d9f7829633\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b6915f908509c8609290327ffc2dccf0e5680dc227979285a7ebaca4643cb7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e837e02e63dbe59e7920302c0fb0b5c9165e96ebb684adadb02bacd61633214\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"le observer\\\\nW0123 10:51:48.273886 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0123 10:51:48.273997 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0123 10:51:48.275269 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3502356273/tls.crt::/tmp/serving-cert-3502356273/tls.key\\\\\\\"\\\\nI0123 10:51:48.537137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0123 10:51:48.548481 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0123 10:51:48.548515 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0123 10:51:48.548544 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0123 10:51:48.548577 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0123 10:51:48.561057 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0123 10:51:48.561112 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0123 10:51:48.561123 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0123 10:51:48.561139 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0123 10:51:48.561151 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0123 10:51:48.561158 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0123 10:51:48.561167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0123 10:51:48.561209 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0123 10:51:48.561756 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da77583099215643577c5d064d67ce2cca9d0b74e7ba7c88f3a948a8516fd66c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd61e9841e88c0a34764491cfafe3e4f94233e4ac0543a3e6ed1326dd8d7280d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd61e9841e88c0a34764491cfafe3e4f94233e4ac0543a3e6ed1326dd8d7280d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:06Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:06 crc kubenswrapper[4957]: I0123 10:52:06.035963 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:06Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:06 crc kubenswrapper[4957]: I0123 10:52:06.055444 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9f19c40d295c11e3a1170d61fd738b1dcd8fb10087f6a1bb74e6e6c8e6cfb56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6fcda9eaf99f7b60db85da6f18a98ccca7b5bc532aa28388fc7845caf1a7356\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:06Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:06 crc kubenswrapper[4957]: I0123 10:52:06.073256 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:06Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:06 crc kubenswrapper[4957]: I0123 10:52:06.093796 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://512cd439903792d034cd6017d149d8f3e9e24ffbfc36964572fc9419d54c3513\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:06Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:06 crc kubenswrapper[4957]: I0123 10:52:06.095381 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:06 crc kubenswrapper[4957]: I0123 10:52:06.095408 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:06 crc kubenswrapper[4957]: I0123 10:52:06.095417 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:06 crc kubenswrapper[4957]: I0123 10:52:06.095432 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:06 crc kubenswrapper[4957]: I0123 10:52:06.095444 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:06Z","lastTransitionTime":"2026-01-23T10:52:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:06 crc kubenswrapper[4957]: I0123 10:52:06.107086 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:06Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:06 crc kubenswrapper[4957]: I0123 10:52:06.193206 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/87775b38-0664-48f6-8857-7568c135bd79-metrics-certs\") pod \"network-metrics-daemon-5fxfb\" (UID: \"87775b38-0664-48f6-8857-7568c135bd79\") " pod="openshift-multus/network-metrics-daemon-5fxfb" Jan 23 10:52:06 crc kubenswrapper[4957]: E0123 10:52:06.193491 4957 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 23 10:52:06 crc kubenswrapper[4957]: E0123 10:52:06.193659 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/87775b38-0664-48f6-8857-7568c135bd79-metrics-certs podName:87775b38-0664-48f6-8857-7568c135bd79 nodeName:}" failed. No retries permitted until 2026-01-23 10:52:08.193620262 +0000 UTC m=+37.730872999 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/87775b38-0664-48f6-8857-7568c135bd79-metrics-certs") pod "network-metrics-daemon-5fxfb" (UID: "87775b38-0664-48f6-8857-7568c135bd79") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 23 10:52:06 crc kubenswrapper[4957]: I0123 10:52:06.198223 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:06 crc kubenswrapper[4957]: I0123 10:52:06.198315 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:06 crc kubenswrapper[4957]: I0123 10:52:06.198343 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:06 crc kubenswrapper[4957]: I0123 10:52:06.198374 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:06 crc kubenswrapper[4957]: I0123 10:52:06.198397 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:06Z","lastTransitionTime":"2026-01-23T10:52:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:06 crc kubenswrapper[4957]: I0123 10:52:06.300945 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:06 crc kubenswrapper[4957]: I0123 10:52:06.300992 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:06 crc kubenswrapper[4957]: I0123 10:52:06.301004 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:06 crc kubenswrapper[4957]: I0123 10:52:06.301022 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:06 crc kubenswrapper[4957]: I0123 10:52:06.301034 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:06Z","lastTransitionTime":"2026-01-23T10:52:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:06 crc kubenswrapper[4957]: I0123 10:52:06.403421 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:06 crc kubenswrapper[4957]: I0123 10:52:06.403575 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:06 crc kubenswrapper[4957]: I0123 10:52:06.403607 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:06 crc kubenswrapper[4957]: I0123 10:52:06.403636 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:06 crc kubenswrapper[4957]: I0123 10:52:06.403658 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:06Z","lastTransitionTime":"2026-01-23T10:52:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:06 crc kubenswrapper[4957]: I0123 10:52:06.507362 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:06 crc kubenswrapper[4957]: I0123 10:52:06.507466 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:06 crc kubenswrapper[4957]: I0123 10:52:06.507498 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:06 crc kubenswrapper[4957]: I0123 10:52:06.507531 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:06 crc kubenswrapper[4957]: I0123 10:52:06.507554 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:06Z","lastTransitionTime":"2026-01-23T10:52:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:06 crc kubenswrapper[4957]: I0123 10:52:06.610832 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:06 crc kubenswrapper[4957]: I0123 10:52:06.610892 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:06 crc kubenswrapper[4957]: I0123 10:52:06.610915 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:06 crc kubenswrapper[4957]: I0123 10:52:06.610944 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:06 crc kubenswrapper[4957]: I0123 10:52:06.610968 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:06Z","lastTransitionTime":"2026-01-23T10:52:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:06 crc kubenswrapper[4957]: I0123 10:52:06.713570 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:06 crc kubenswrapper[4957]: I0123 10:52:06.713625 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:06 crc kubenswrapper[4957]: I0123 10:52:06.713644 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:06 crc kubenswrapper[4957]: I0123 10:52:06.713665 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:06 crc kubenswrapper[4957]: I0123 10:52:06.713680 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:06Z","lastTransitionTime":"2026-01-23T10:52:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:06 crc kubenswrapper[4957]: I0123 10:52:06.733452 4957 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 09:55:30.180115697 +0000 UTC Jan 23 10:52:06 crc kubenswrapper[4957]: I0123 10:52:06.769421 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 23 10:52:06 crc kubenswrapper[4957]: I0123 10:52:06.769459 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 23 10:52:06 crc kubenswrapper[4957]: I0123 10:52:06.769477 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 23 10:52:06 crc kubenswrapper[4957]: I0123 10:52:06.769430 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5fxfb" Jan 23 10:52:06 crc kubenswrapper[4957]: E0123 10:52:06.769604 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 23 10:52:06 crc kubenswrapper[4957]: E0123 10:52:06.769686 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5fxfb" podUID="87775b38-0664-48f6-8857-7568c135bd79" Jan 23 10:52:06 crc kubenswrapper[4957]: E0123 10:52:06.769764 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 23 10:52:06 crc kubenswrapper[4957]: E0123 10:52:06.769915 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 23 10:52:06 crc kubenswrapper[4957]: I0123 10:52:06.815935 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:06 crc kubenswrapper[4957]: I0123 10:52:06.816001 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:06 crc kubenswrapper[4957]: I0123 10:52:06.816027 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:06 crc kubenswrapper[4957]: I0123 10:52:06.816062 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:06 crc kubenswrapper[4957]: I0123 10:52:06.816086 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:06Z","lastTransitionTime":"2026-01-23T10:52:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:06 crc kubenswrapper[4957]: I0123 10:52:06.918369 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:06 crc kubenswrapper[4957]: I0123 10:52:06.918437 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:06 crc kubenswrapper[4957]: I0123 10:52:06.918458 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:06 crc kubenswrapper[4957]: I0123 10:52:06.918488 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:06 crc kubenswrapper[4957]: I0123 10:52:06.918511 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:06Z","lastTransitionTime":"2026-01-23T10:52:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:07 crc kubenswrapper[4957]: I0123 10:52:07.020841 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:07 crc kubenswrapper[4957]: I0123 10:52:07.020926 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:07 crc kubenswrapper[4957]: I0123 10:52:07.020939 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:07 crc kubenswrapper[4957]: I0123 10:52:07.020958 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:07 crc kubenswrapper[4957]: I0123 10:52:07.020969 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:07Z","lastTransitionTime":"2026-01-23T10:52:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:07 crc kubenswrapper[4957]: I0123 10:52:07.123908 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:07 crc kubenswrapper[4957]: I0123 10:52:07.123948 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:07 crc kubenswrapper[4957]: I0123 10:52:07.123956 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:07 crc kubenswrapper[4957]: I0123 10:52:07.123972 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:07 crc kubenswrapper[4957]: I0123 10:52:07.123984 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:07Z","lastTransitionTime":"2026-01-23T10:52:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:07 crc kubenswrapper[4957]: I0123 10:52:07.227505 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:07 crc kubenswrapper[4957]: I0123 10:52:07.227559 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:07 crc kubenswrapper[4957]: I0123 10:52:07.227571 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:07 crc kubenswrapper[4957]: I0123 10:52:07.227589 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:07 crc kubenswrapper[4957]: I0123 10:52:07.227604 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:07Z","lastTransitionTime":"2026-01-23T10:52:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:07 crc kubenswrapper[4957]: I0123 10:52:07.330540 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:07 crc kubenswrapper[4957]: I0123 10:52:07.330596 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:07 crc kubenswrapper[4957]: I0123 10:52:07.330613 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:07 crc kubenswrapper[4957]: I0123 10:52:07.330636 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:07 crc kubenswrapper[4957]: I0123 10:52:07.330653 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:07Z","lastTransitionTime":"2026-01-23T10:52:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:07 crc kubenswrapper[4957]: I0123 10:52:07.432766 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:07 crc kubenswrapper[4957]: I0123 10:52:07.432824 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:07 crc kubenswrapper[4957]: I0123 10:52:07.432839 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:07 crc kubenswrapper[4957]: I0123 10:52:07.432863 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:07 crc kubenswrapper[4957]: I0123 10:52:07.432882 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:07Z","lastTransitionTime":"2026-01-23T10:52:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:07 crc kubenswrapper[4957]: I0123 10:52:07.536713 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:07 crc kubenswrapper[4957]: I0123 10:52:07.536788 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:07 crc kubenswrapper[4957]: I0123 10:52:07.536813 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:07 crc kubenswrapper[4957]: I0123 10:52:07.536843 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:07 crc kubenswrapper[4957]: I0123 10:52:07.536880 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:07Z","lastTransitionTime":"2026-01-23T10:52:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:07 crc kubenswrapper[4957]: I0123 10:52:07.640578 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:07 crc kubenswrapper[4957]: I0123 10:52:07.640656 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:07 crc kubenswrapper[4957]: I0123 10:52:07.640679 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:07 crc kubenswrapper[4957]: I0123 10:52:07.640708 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:07 crc kubenswrapper[4957]: I0123 10:52:07.640732 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:07Z","lastTransitionTime":"2026-01-23T10:52:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:07 crc kubenswrapper[4957]: I0123 10:52:07.734413 4957 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-14 19:05:21.474454444 +0000 UTC Jan 23 10:52:07 crc kubenswrapper[4957]: I0123 10:52:07.744011 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:07 crc kubenswrapper[4957]: I0123 10:52:07.744075 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:07 crc kubenswrapper[4957]: I0123 10:52:07.744097 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:07 crc kubenswrapper[4957]: I0123 10:52:07.744124 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:07 crc kubenswrapper[4957]: I0123 10:52:07.744145 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:07Z","lastTransitionTime":"2026-01-23T10:52:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:07 crc kubenswrapper[4957]: I0123 10:52:07.847921 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:07 crc kubenswrapper[4957]: I0123 10:52:07.847962 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:07 crc kubenswrapper[4957]: I0123 10:52:07.847979 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:07 crc kubenswrapper[4957]: I0123 10:52:07.848001 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:07 crc kubenswrapper[4957]: I0123 10:52:07.848018 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:07Z","lastTransitionTime":"2026-01-23T10:52:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:07 crc kubenswrapper[4957]: I0123 10:52:07.951510 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:07 crc kubenswrapper[4957]: I0123 10:52:07.951603 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:07 crc kubenswrapper[4957]: I0123 10:52:07.951626 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:07 crc kubenswrapper[4957]: I0123 10:52:07.951661 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:07 crc kubenswrapper[4957]: I0123 10:52:07.951684 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:07Z","lastTransitionTime":"2026-01-23T10:52:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:08 crc kubenswrapper[4957]: I0123 10:52:08.054743 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:08 crc kubenswrapper[4957]: I0123 10:52:08.055087 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:08 crc kubenswrapper[4957]: I0123 10:52:08.055216 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:08 crc kubenswrapper[4957]: I0123 10:52:08.055389 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:08 crc kubenswrapper[4957]: I0123 10:52:08.055514 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:08Z","lastTransitionTime":"2026-01-23T10:52:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:08 crc kubenswrapper[4957]: I0123 10:52:08.158944 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:08 crc kubenswrapper[4957]: I0123 10:52:08.159001 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:08 crc kubenswrapper[4957]: I0123 10:52:08.159020 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:08 crc kubenswrapper[4957]: I0123 10:52:08.159044 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:08 crc kubenswrapper[4957]: I0123 10:52:08.159062 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:08Z","lastTransitionTime":"2026-01-23T10:52:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:08 crc kubenswrapper[4957]: I0123 10:52:08.215804 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/87775b38-0664-48f6-8857-7568c135bd79-metrics-certs\") pod \"network-metrics-daemon-5fxfb\" (UID: \"87775b38-0664-48f6-8857-7568c135bd79\") " pod="openshift-multus/network-metrics-daemon-5fxfb" Jan 23 10:52:08 crc kubenswrapper[4957]: E0123 10:52:08.215982 4957 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 23 10:52:08 crc kubenswrapper[4957]: E0123 10:52:08.216055 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/87775b38-0664-48f6-8857-7568c135bd79-metrics-certs podName:87775b38-0664-48f6-8857-7568c135bd79 nodeName:}" failed. No retries permitted until 2026-01-23 10:52:12.216031102 +0000 UTC m=+41.753283819 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/87775b38-0664-48f6-8857-7568c135bd79-metrics-certs") pod "network-metrics-daemon-5fxfb" (UID: "87775b38-0664-48f6-8857-7568c135bd79") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 23 10:52:08 crc kubenswrapper[4957]: I0123 10:52:08.261970 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:08 crc kubenswrapper[4957]: I0123 10:52:08.262033 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:08 crc kubenswrapper[4957]: I0123 10:52:08.262052 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:08 crc kubenswrapper[4957]: I0123 10:52:08.262079 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:08 crc kubenswrapper[4957]: I0123 10:52:08.262096 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:08Z","lastTransitionTime":"2026-01-23T10:52:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:08 crc kubenswrapper[4957]: I0123 10:52:08.365845 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:08 crc kubenswrapper[4957]: I0123 10:52:08.365971 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:08 crc kubenswrapper[4957]: I0123 10:52:08.365989 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:08 crc kubenswrapper[4957]: I0123 10:52:08.366016 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:08 crc kubenswrapper[4957]: I0123 10:52:08.366036 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:08Z","lastTransitionTime":"2026-01-23T10:52:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:08 crc kubenswrapper[4957]: I0123 10:52:08.469045 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:08 crc kubenswrapper[4957]: I0123 10:52:08.469105 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:08 crc kubenswrapper[4957]: I0123 10:52:08.469123 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:08 crc kubenswrapper[4957]: I0123 10:52:08.469144 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:08 crc kubenswrapper[4957]: I0123 10:52:08.469161 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:08Z","lastTransitionTime":"2026-01-23T10:52:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:08 crc kubenswrapper[4957]: I0123 10:52:08.571741 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:08 crc kubenswrapper[4957]: I0123 10:52:08.571799 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:08 crc kubenswrapper[4957]: I0123 10:52:08.571811 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:08 crc kubenswrapper[4957]: I0123 10:52:08.571828 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:08 crc kubenswrapper[4957]: I0123 10:52:08.571842 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:08Z","lastTransitionTime":"2026-01-23T10:52:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:08 crc kubenswrapper[4957]: I0123 10:52:08.674536 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:08 crc kubenswrapper[4957]: I0123 10:52:08.674616 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:08 crc kubenswrapper[4957]: I0123 10:52:08.674640 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:08 crc kubenswrapper[4957]: I0123 10:52:08.674670 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:08 crc kubenswrapper[4957]: I0123 10:52:08.674693 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:08Z","lastTransitionTime":"2026-01-23T10:52:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:08 crc kubenswrapper[4957]: I0123 10:52:08.735052 4957 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 10:10:02.114132578 +0000 UTC Jan 23 10:52:08 crc kubenswrapper[4957]: I0123 10:52:08.769543 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 23 10:52:08 crc kubenswrapper[4957]: I0123 10:52:08.769643 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 23 10:52:08 crc kubenswrapper[4957]: I0123 10:52:08.769665 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 23 10:52:08 crc kubenswrapper[4957]: E0123 10:52:08.769839 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 23 10:52:08 crc kubenswrapper[4957]: I0123 10:52:08.769905 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5fxfb" Jan 23 10:52:08 crc kubenswrapper[4957]: E0123 10:52:08.769951 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 23 10:52:08 crc kubenswrapper[4957]: E0123 10:52:08.770086 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5fxfb" podUID="87775b38-0664-48f6-8857-7568c135bd79" Jan 23 10:52:08 crc kubenswrapper[4957]: E0123 10:52:08.770231 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 23 10:52:08 crc kubenswrapper[4957]: I0123 10:52:08.776730 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:08 crc kubenswrapper[4957]: I0123 10:52:08.776765 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:08 crc kubenswrapper[4957]: I0123 10:52:08.776777 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:08 crc kubenswrapper[4957]: I0123 10:52:08.776793 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:08 crc kubenswrapper[4957]: I0123 10:52:08.776805 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:08Z","lastTransitionTime":"2026-01-23T10:52:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:08 crc kubenswrapper[4957]: I0123 10:52:08.879844 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:08 crc kubenswrapper[4957]: I0123 10:52:08.879917 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:08 crc kubenswrapper[4957]: I0123 10:52:08.879939 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:08 crc kubenswrapper[4957]: I0123 10:52:08.879968 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:08 crc kubenswrapper[4957]: I0123 10:52:08.879991 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:08Z","lastTransitionTime":"2026-01-23T10:52:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:08 crc kubenswrapper[4957]: I0123 10:52:08.983439 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:08 crc kubenswrapper[4957]: I0123 10:52:08.983660 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:08 crc kubenswrapper[4957]: I0123 10:52:08.983675 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:08 crc kubenswrapper[4957]: I0123 10:52:08.983690 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:08 crc kubenswrapper[4957]: I0123 10:52:08.983728 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:08Z","lastTransitionTime":"2026-01-23T10:52:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:09 crc kubenswrapper[4957]: I0123 10:52:09.086932 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:09 crc kubenswrapper[4957]: I0123 10:52:09.087011 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:09 crc kubenswrapper[4957]: I0123 10:52:09.087036 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:09 crc kubenswrapper[4957]: I0123 10:52:09.087074 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:09 crc kubenswrapper[4957]: I0123 10:52:09.087102 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:09Z","lastTransitionTime":"2026-01-23T10:52:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:09 crc kubenswrapper[4957]: I0123 10:52:09.190116 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:09 crc kubenswrapper[4957]: I0123 10:52:09.190258 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:09 crc kubenswrapper[4957]: I0123 10:52:09.190321 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:09 crc kubenswrapper[4957]: I0123 10:52:09.190350 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:09 crc kubenswrapper[4957]: I0123 10:52:09.190366 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:09Z","lastTransitionTime":"2026-01-23T10:52:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:09 crc kubenswrapper[4957]: I0123 10:52:09.293248 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:09 crc kubenswrapper[4957]: I0123 10:52:09.293358 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:09 crc kubenswrapper[4957]: I0123 10:52:09.293382 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:09 crc kubenswrapper[4957]: I0123 10:52:09.293410 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:09 crc kubenswrapper[4957]: I0123 10:52:09.293435 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:09Z","lastTransitionTime":"2026-01-23T10:52:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:09 crc kubenswrapper[4957]: I0123 10:52:09.397247 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:09 crc kubenswrapper[4957]: I0123 10:52:09.397363 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:09 crc kubenswrapper[4957]: I0123 10:52:09.397387 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:09 crc kubenswrapper[4957]: I0123 10:52:09.397415 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:09 crc kubenswrapper[4957]: I0123 10:52:09.397436 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:09Z","lastTransitionTime":"2026-01-23T10:52:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:09 crc kubenswrapper[4957]: I0123 10:52:09.502221 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:09 crc kubenswrapper[4957]: I0123 10:52:09.502311 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:09 crc kubenswrapper[4957]: I0123 10:52:09.502324 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:09 crc kubenswrapper[4957]: I0123 10:52:09.502346 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:09 crc kubenswrapper[4957]: I0123 10:52:09.502359 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:09Z","lastTransitionTime":"2026-01-23T10:52:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:09 crc kubenswrapper[4957]: I0123 10:52:09.527612 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" Jan 23 10:52:09 crc kubenswrapper[4957]: I0123 10:52:09.529062 4957 scope.go:117] "RemoveContainer" containerID="d660c0e57541cf895c30cdd702c50a43e189e0ed85fe800f422cc1646cbb57e5" Jan 23 10:52:09 crc kubenswrapper[4957]: E0123 10:52:09.529435 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-z8hcw_openshift-ovn-kubernetes(87adc28a-89e3-4743-a9f2-098d4a9432d8)\"" pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" podUID="87adc28a-89e3-4743-a9f2-098d4a9432d8" Jan 23 10:52:09 crc kubenswrapper[4957]: I0123 10:52:09.561612 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8631604-63ce-40b0-b27e-fba17f940f20\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://523b9a208f414955faffe254957d3bb6d287eab26ea653e23c9bcc2c3182d5cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80645d17b02b24a907b20d376fcb65a794768d4c9cf07550bff63d50a011836d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":
\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3fcfc9fcf5f37f32b4a654710f3f0f5c3fab5b0b5c35239e5f1a2789d1ec480\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b48e3593322a778bf8d56e3509d97a341f1fee5e172f8ba4bbc4c1dacefb3930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://193091ca5d5fb974b1e2da289e7fbc6e2d3a292e79d1936c7ba10266a5ba9779\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7feaf0932d8842db7af2f28984a811c16674b50ba28a4627829ce8471543daa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7feaf0932d8842db7af2f28984a811c16674b50ba28a4627829ce8471543daa9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt
\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://370f9a1676bfe308ad9883454af09a802df678db77e0e246de3cc7aea91410a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://370f9a1676bfe308ad9883454af09a802df678db77e0e246de3cc7aea91410a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://78b38484db8444b196494249f32bd63f2478d80e89ff823c8ac7c974b1ea6e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78b38484db8444b196494249f32bd63f2478d80e89ff823c8ac7c974b1ea6e76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:09Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:09 crc kubenswrapper[4957]: I0123 10:52:09.577746 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b07c6571fe0e39bd6607feb900919a481ef8a36483c9b4de1c6d5ea3453ba61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:09Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:09 crc kubenswrapper[4957]: I0123 10:52:09.592558 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fnxz6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c7b1449-2e9b-4c07-a531-591cb968f511\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6410b47b5b38b4ce50175e8cd9c2cc7ca241b914d1dba4accf3a1deb3e066ccd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj687\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:49Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fnxz6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:09Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:09 crc kubenswrapper[4957]: I0123 10:52:09.605025 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:09 crc kubenswrapper[4957]: I0123 10:52:09.605121 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:09 crc kubenswrapper[4957]: I0123 10:52:09.605226 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:09 crc kubenswrapper[4957]: I0123 10:52:09.605252 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:09 crc kubenswrapper[4957]: I0123 10:52:09.605333 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:09Z","lastTransitionTime":"2026-01-23T10:52:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:09 crc kubenswrapper[4957]: I0123 10:52:09.606836 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9f19c40d295c11e3a1170d61fd738b1dcd8fb10087f6a1bb74e6e6c8e6cfb56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6fcda9eaf99f7b60db85da6f18a98ccca7b5bc532aa28388fc7845caf1a7356\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:09Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:09 crc kubenswrapper[4957]: I0123 10:52:09.620295 4957 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:09Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:09 crc kubenswrapper[4957]: I0123 10:52:09.635202 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6cq2v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11d94cd0-1619-4ef6-952a-aef84e1cdc75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdb19fbdef461009ebd78d9089ba9c94908e4c9fbcab108320e0d89c7f30547f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d28aff8ae7d4adbef5753dc21af0f1055f19ed4404bf9af31f34be215120555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d28aff8ae7d4adbef5753dc21af0f1055f19ed4404bf9af31f34be215120555\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90e2094361e033ef7c8226913e9e396e6a2036617edb9c5ad0ed00c2c5f9f707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90e2094361e033ef7c8226913e9e396e6a2036617edb9c5ad0ed00c2c5f9f707\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e945a247fdc5833508edb04c5130a406d137e3b0ab1be12f9bf954f2621c3bf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e945a247fdc5833508edb04c5130a406d137e3b0ab1be12f9bf954f2621c3bf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26ffeff282bb20d68e9f648999f28b818a67b74f062714fd26ad22e70591a74c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26ffeff282bb20d68e9f648999f28b818a67b74f062714fd26ad22e70591a74c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bbe74c2c27b97766cfa5d26987cd0fbd8289871e66eab78c54a52c0aa039728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bbe74c2c27b97766cfa5d26987cd0fbd8289871e66eab78c54a52c0aa039728\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9ee5b55e77324735662dd6bc0fdeee86af454eb4b9e8eb9e877119f7c1395ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9ee5b55e77324735662dd6bc0fdeee86af454eb4b9e8eb9e877119f7c1395ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6cq2v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:09Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:09 crc kubenswrapper[4957]: I0123 10:52:09.651668 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5fxfb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"87775b38-0664-48f6-8857-7568c135bd79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrj7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrj7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:52:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5fxfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:09Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:09 crc kubenswrapper[4957]: I0123 10:52:09.674332 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea507738-b425-4366-808b-3a47317e66d0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cdf71b1a8491d3a4853fde19a5b1af1eb4697cbf07de482e22a52704ba0470f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52eea4c3c7c3b8898e64dd0eb05c1883ea1c2fa94e7e606f3ab48bbf5aaee8d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f405b6b517d30a201b793965bd82536f496d62b89562cefc7e3a9d9f7829633\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b6915f908509c8609290327ffc2dccf0e5680dc227979285a7ebaca4643cb7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e837e02e63dbe59e7920302c0fb0b5c9165e96ebb684adadb02bacd61633214\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"le observer\\\\nW0123 10:51:48.273886 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0123 10:51:48.273997 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0123 10:51:48.275269 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3502356273/tls.crt::/tmp/serving-cert-3502356273/tls.key\\\\\\\"\\\\nI0123 10:51:48.537137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0123 10:51:48.548481 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0123 10:51:48.548515 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0123 10:51:48.548544 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0123 10:51:48.548577 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0123 10:51:48.561057 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0123 10:51:48.561112 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0123 10:51:48.561123 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0123 10:51:48.561139 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0123 10:51:48.561151 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0123 10:51:48.561158 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0123 10:51:48.561167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0123 10:51:48.561209 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0123 10:51:48.561756 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da77583099215643577c5d064d67ce2cca9d0b74e7ba7c88f3a948a8516fd66c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd61e9841e88c0a34764491cfafe3e4f94233e4ac0543a3e6ed1326dd8d7280d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd61e9841e88c0a34764491cfafe3e4f94233e4ac0543a3e6ed1326dd8d7280d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:09Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:09 crc kubenswrapper[4957]: I0123 10:52:09.694024 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:09Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:09 crc kubenswrapper[4957]: I0123 10:52:09.709343 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:09 crc kubenswrapper[4957]: I0123 10:52:09.709485 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:09 crc kubenswrapper[4957]: I0123 10:52:09.709532 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:09 crc kubenswrapper[4957]: I0123 10:52:09.709563 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:09 crc kubenswrapper[4957]: I0123 10:52:09.709582 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:09Z","lastTransitionTime":"2026-01-23T10:52:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:09 crc kubenswrapper[4957]: I0123 10:52:09.715339 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://512cd439903792d034cd6017d149d8f3e9e24ffbfc36964572fc9419d54c3513\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:09Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:09 crc kubenswrapper[4957]: I0123 10:52:09.731242 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:09Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:09 crc kubenswrapper[4957]: I0123 10:52:09.735428 4957 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-22 10:13:56.469368794 +0000 UTC Jan 23 10:52:09 crc kubenswrapper[4957]: I0123 10:52:09.753271 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87adc28a-89e3-4743-a9f2-098d4a9432d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a14cf2687aa7c7a4c43dffbc2ad99a41aef0e46719171f63c7f769ee2d54e4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca18567eec1b0cc34d911b28d9f3d670a061722086817f58236f6a0da557262\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f862c6f11fe904458a8ecde92079c0b4aa4a9cb4dfc6f2ca094a1d3142570d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0efa75cf10a812bc4de5b071048558eb5f48828f6fb3049f3820fe5e0b7e2b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1a90ac89ce8ac710e5f8cff26e69aff44a735ef8155a7e93324809904a33e01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26b4bdc4f2514902dc2c95df59af4c954a5c1905821f5981e9437ff54d6d544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d660c0e57541cf895c30cdd702c50a43e189e0ed
85fe800f422cc1646cbb57e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d660c0e57541cf895c30cdd702c50a43e189e0ed85fe800f422cc1646cbb57e5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-23T10:52:02Z\\\",\\\"message\\\":\\\"ft/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0123 10:52:02.002619 6388 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0123 10:52:02.002696 6388 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0123 10:52:02.003246 6388 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0123 10:52:02.003275 6388 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0123 10:52:02.003331 6388 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0123 10:52:02.003358 6388 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0123 10:52:02.003364 6388 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0123 10:52:02.003360 6388 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0123 10:52:02.003378 6388 factory.go:656] Stopping watch factory\\\\nI0123 10:52:02.003385 6388 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0123 10:52:02.003395 6388 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0123 10:52:02.003406 6388 handler.go:208] Removed *v1.Namespace ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-23T10:52:01Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-z8hcw_openshift-ovn-kubernetes(87adc28a-89e3-4743-a9f2-098d4a9432d8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1be5b459e3fae28da165ef0ee506ec5ccd39026d7b7e7c35a3f242c65d60d0ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3de3a4f166173fb3f0ff7428b1d1ddf43a9988749fc445ecff4a7527bd1f3a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3de3a4f166173fb3f0ff7428b1d1ddf43a9988749fc445ecff4a7527bd1f3a4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z8hcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:09Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:09 crc kubenswrapper[4957]: I0123 10:52:09.767830 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9rkq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"340bb9e5-0a20-4377-acf7-aba4b7788153\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac26363f220465a8681578f7a7b90cf7d0abf8676379a1e963f4998646327c95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:52:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55l78
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19e8778b10a2bb2a713374cb69e07a76edf2371f7a130a691e759a03c0322251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:52:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55l78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:52:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r9rkq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:09Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:09 crc kubenswrapper[4957]: I0123 10:52:09.784470 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb53662e-fe72-4c19-b3a6-f5b541e5afcb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://829cfdb541d2a7861316957f39b8b9f43ec6f9f4e309a491f4451b1f3c34a9a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d9f270c80ebedc7d79510e2f421e23789483dce954f5e1469469703660febf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90926087d1bb350c991fa9425706fcc22e12eec003aba87b72758892aae9d3a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8144556693b41dc2f9121be49ceed161caf8db5eec797f086128a2016be8072\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:09Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:09 crc kubenswrapper[4957]: I0123 10:52:09.800067 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rg9hb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a6ddd9-627a-4faa-a4c4-096ea19af31d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f469de9a3c43ade33ae855757f1244dcd825827dea9633af7143c078b08d6d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wngq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase
\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rg9hb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:09Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:09 crc kubenswrapper[4957]: I0123 10:52:09.810835 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2xjv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"224e3211-1f68-4673-8975-7e71b1e513d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dd046581d049e9ca0071a010da143a9b28d271b533b9cdc1c94d19311be0320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s26mf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f355e8990ff693448c7b8df392b7b2caeb59d6fee6cf8d5d4200f8ce1b5e03ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s26mf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"host
IP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w2xjv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:09Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:09 crc kubenswrapper[4957]: I0123 10:52:09.813634 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:09 crc kubenswrapper[4957]: I0123 10:52:09.813665 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:09 crc kubenswrapper[4957]: I0123 10:52:09.813675 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:09 crc kubenswrapper[4957]: I0123 10:52:09.813689 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:09 crc kubenswrapper[4957]: I0123 10:52:09.813698 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:09Z","lastTransitionTime":"2026-01-23T10:52:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:09 crc kubenswrapper[4957]: I0123 10:52:09.825148 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tlz2g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"233fdd78-4010-4fe8-9068-ee47d8ff25d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6851e0ec1040550b8c9edb1b85213d2c849e381fae6b0f09c9a7247bd9c5088\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpwrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tlz2g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:09Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:09 crc kubenswrapper[4957]: I0123 10:52:09.918350 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:09 crc kubenswrapper[4957]: I0123 10:52:09.918401 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:09 crc kubenswrapper[4957]: I0123 10:52:09.918413 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:09 crc kubenswrapper[4957]: I0123 10:52:09.918433 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:09 crc kubenswrapper[4957]: I0123 10:52:09.918445 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:09Z","lastTransitionTime":"2026-01-23T10:52:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:10 crc kubenswrapper[4957]: I0123 10:52:10.021553 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:10 crc kubenswrapper[4957]: I0123 10:52:10.021595 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:10 crc kubenswrapper[4957]: I0123 10:52:10.021605 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:10 crc kubenswrapper[4957]: I0123 10:52:10.021620 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:10 crc kubenswrapper[4957]: I0123 10:52:10.021631 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:10Z","lastTransitionTime":"2026-01-23T10:52:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:10 crc kubenswrapper[4957]: I0123 10:52:10.124561 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:10 crc kubenswrapper[4957]: I0123 10:52:10.124608 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:10 crc kubenswrapper[4957]: I0123 10:52:10.124620 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:10 crc kubenswrapper[4957]: I0123 10:52:10.124641 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:10 crc kubenswrapper[4957]: I0123 10:52:10.124652 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:10Z","lastTransitionTime":"2026-01-23T10:52:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:10 crc kubenswrapper[4957]: I0123 10:52:10.227702 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:10 crc kubenswrapper[4957]: I0123 10:52:10.227782 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:10 crc kubenswrapper[4957]: I0123 10:52:10.227801 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:10 crc kubenswrapper[4957]: I0123 10:52:10.227856 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:10 crc kubenswrapper[4957]: I0123 10:52:10.227873 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:10Z","lastTransitionTime":"2026-01-23T10:52:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:10 crc kubenswrapper[4957]: I0123 10:52:10.330654 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:10 crc kubenswrapper[4957]: I0123 10:52:10.330698 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:10 crc kubenswrapper[4957]: I0123 10:52:10.330716 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:10 crc kubenswrapper[4957]: I0123 10:52:10.330735 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:10 crc kubenswrapper[4957]: I0123 10:52:10.330745 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:10Z","lastTransitionTime":"2026-01-23T10:52:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:10 crc kubenswrapper[4957]: I0123 10:52:10.433699 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:10 crc kubenswrapper[4957]: I0123 10:52:10.433763 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:10 crc kubenswrapper[4957]: I0123 10:52:10.433788 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:10 crc kubenswrapper[4957]: I0123 10:52:10.433817 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:10 crc kubenswrapper[4957]: I0123 10:52:10.433839 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:10Z","lastTransitionTime":"2026-01-23T10:52:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:10 crc kubenswrapper[4957]: I0123 10:52:10.536836 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:10 crc kubenswrapper[4957]: I0123 10:52:10.536912 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:10 crc kubenswrapper[4957]: I0123 10:52:10.536931 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:10 crc kubenswrapper[4957]: I0123 10:52:10.536959 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:10 crc kubenswrapper[4957]: I0123 10:52:10.536977 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:10Z","lastTransitionTime":"2026-01-23T10:52:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:10 crc kubenswrapper[4957]: I0123 10:52:10.639397 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:10 crc kubenswrapper[4957]: I0123 10:52:10.639455 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:10 crc kubenswrapper[4957]: I0123 10:52:10.639476 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:10 crc kubenswrapper[4957]: I0123 10:52:10.639502 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:10 crc kubenswrapper[4957]: I0123 10:52:10.639517 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:10Z","lastTransitionTime":"2026-01-23T10:52:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:10 crc kubenswrapper[4957]: I0123 10:52:10.736605 4957 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-19 07:41:02.109044974 +0000 UTC Jan 23 10:52:10 crc kubenswrapper[4957]: I0123 10:52:10.741514 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:10 crc kubenswrapper[4957]: I0123 10:52:10.741569 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:10 crc kubenswrapper[4957]: I0123 10:52:10.741581 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:10 crc kubenswrapper[4957]: I0123 10:52:10.741600 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:10 crc kubenswrapper[4957]: I0123 10:52:10.741611 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:10Z","lastTransitionTime":"2026-01-23T10:52:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:10 crc kubenswrapper[4957]: I0123 10:52:10.769152 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 23 10:52:10 crc kubenswrapper[4957]: I0123 10:52:10.769161 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 23 10:52:10 crc kubenswrapper[4957]: I0123 10:52:10.769252 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 23 10:52:10 crc kubenswrapper[4957]: E0123 10:52:10.769679 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 23 10:52:10 crc kubenswrapper[4957]: I0123 10:52:10.769701 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5fxfb" Jan 23 10:52:10 crc kubenswrapper[4957]: E0123 10:52:10.769820 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 23 10:52:10 crc kubenswrapper[4957]: E0123 10:52:10.769976 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 23 10:52:10 crc kubenswrapper[4957]: E0123 10:52:10.770180 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5fxfb" podUID="87775b38-0664-48f6-8857-7568c135bd79" Jan 23 10:52:10 crc kubenswrapper[4957]: I0123 10:52:10.789662 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8631604-63ce-40b0-b27e-fba17f940f20\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://523b9a208f414955faffe254957d3bb6d287eab26ea653e23c9bcc2c3182d5cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80645d17b02b24a907b20d376fcb65a794768d4c9cf07550bff63d50a011836d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":
0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3fcfc9fcf5f37f32b4a654710f3f0f5c3fab5b0b5c35239e5f1a2789d1ec480\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b48e3593322a778bf8d56e3509d97a341f1fee5e172f8ba4bbc4c1dacefb3930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://193091ca5d5fb974b1e2da289e7fbc6e2d3a292e79d1936c7ba10266a5ba9779\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7feaf0932d8842db7af2f28984a811c16674b50ba28a4627829ce8471543daa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"contain
erID\\\":\\\"cri-o://7feaf0932d8842db7af2f28984a811c16674b50ba28a4627829ce8471543daa9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://370f9a1676bfe308ad9883454af09a802df678db77e0e246de3cc7aea91410a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://370f9a1676bfe308ad9883454af09a802df678db77e0e246de3cc7aea91410a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://78b38484db8444b196494249f32bd63f2478d80e89ff823c8ac7c974b1ea6e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78b38484db8444b196494249f32bd63f2478d80e89ff823c8ac7c974b1ea6e76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:10Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:10 crc kubenswrapper[4957]: I0123 10:52:10.801046 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b07c6571fe0e39bd6607feb900919a481ef8a36483c9b4de1c6d5ea3453ba61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:10Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:10 crc kubenswrapper[4957]: I0123 10:52:10.811545 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fnxz6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c7b1449-2e9b-4c07-a531-591cb968f511\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6410b47b5b38b4ce50175e8cd9c2cc7ca241b914d1dba4accf3a1deb3e066ccd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj687\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:49Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fnxz6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:10Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:10 crc kubenswrapper[4957]: I0123 10:52:10.825331 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea507738-b425-4366-808b-3a47317e66d0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cdf71b1a8491d3a4853fde19a5b1af1eb4697cbf07de482e22a52704ba0470f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52eea4c3c7c3b8898e64dd0eb05c1883ea1c2fa94e7e606f3ab48bbf5aaee8d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f405b6b517d30a201b793965bd82536f496d62b89562cefc7e3a9d9f7829633\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b6915f908509c8609290327ffc2dccf0e5680dc227979285a7ebaca4643cb7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e837e02e63dbe59e7920302c0fb0b5c9165e96ebb684adadb02bacd61633214\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"le observer\\\\nW0123 10:51:48.273886 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0123 10:51:48.273997 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0123 10:51:48.275269 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3502356273/tls.crt::/tmp/serving-cert-3502356273/tls.key\\\\\\\"\\\\nI0123 10:51:48.537137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0123 10:51:48.548481 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0123 10:51:48.548515 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0123 10:51:48.548544 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0123 10:51:48.548577 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0123 10:51:48.561057 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0123 10:51:48.561112 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0123 10:51:48.561123 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0123 10:51:48.561139 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0123 10:51:48.561151 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0123 10:51:48.561158 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0123 10:51:48.561167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0123 10:51:48.561209 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0123 10:51:48.561756 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da77583099215643577c5d064d67ce2cca9d0b74e7ba7c88f3a948a8516fd66c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd61e9841e88c0a34764491cfafe3e4f94233e4ac0543a3e6ed1326dd8d7280d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd61e9841e88c0a34764491cfafe3e4f94233e4ac0543a3e6ed1326dd8d7280d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:10Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:10 crc kubenswrapper[4957]: I0123 10:52:10.839503 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:10Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:10 crc kubenswrapper[4957]: I0123 10:52:10.843761 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:10 crc kubenswrapper[4957]: I0123 10:52:10.843809 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:10 crc kubenswrapper[4957]: I0123 10:52:10.843840 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:10 crc kubenswrapper[4957]: I0123 10:52:10.843859 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:10 crc kubenswrapper[4957]: I0123 10:52:10.843875 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:10Z","lastTransitionTime":"2026-01-23T10:52:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:10 crc kubenswrapper[4957]: I0123 10:52:10.852957 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9f19c40d295c11e3a1170d61fd738b1dcd8fb10087f6a1bb74e6e6c8e6cfb56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6fcda9eaf99f7b60db85da6f18a98ccca7b5bc532aa28388fc7845caf1a7356\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:10Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:10 crc kubenswrapper[4957]: I0123 10:52:10.866875 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:10Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:10 crc kubenswrapper[4957]: I0123 10:52:10.881329 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6cq2v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11d94cd0-1619-4ef6-952a-aef84e1cdc75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdb19fbdef461009ebd78d9089ba9c94908e4c9fbcab108320e0d89c7f30547f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d28aff8ae7d4adbef5753dc21af0f1055f19ed4404bf9af31f34be215120555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d28aff8ae7d4adbef5753dc21af0f1055f19ed4404bf9af31f34be215120555\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90e2094361e033ef7c8226913e9e396e6a2036617edb9c5ad0ed00c2c5f9f707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90e2094361e033ef7c8226913e9e396e6a2036617edb9c5ad0ed00c2c5f9f707\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e945a247fdc5833508edb04c5130a406d137e3b0ab1be12f9bf954f2621c3bf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e945a247fdc5833508edb04c5130a406d137e3b0ab1be12f9bf954f2621c3bf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26ffeff282bb20d68e9f648999f28b818a67b74f062714fd26ad22e70591a74c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26ffeff282bb20d68e9f648999f28b818a67b74f062714fd26ad22e70591a74c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bbe74c2c27b97766cfa5d26987cd0fbd8289871e66eab78c54a52c0aa039728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bbe74c2c27b97766cfa5d26987cd0fbd8289871e66eab78c54a52c0aa039728\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9ee5b55e77324735662dd6bc0fdeee86af454eb4b9e8eb9e877119f7c1395ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9ee5b55e77324735662dd6bc0fdeee86af454eb4b9e8eb9e877119f7c1395ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6cq2v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:10Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:10 crc kubenswrapper[4957]: I0123 10:52:10.891919 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5fxfb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"87775b38-0664-48f6-8857-7568c135bd79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrj7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrj7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:52:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5fxfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:10Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:10 crc kubenswrapper[4957]: I0123 10:52:10.904060 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://512cd439903792d034cd6017d149d8f3e9e24ffbfc36964572fc9419d54c3513\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:10Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:10 crc kubenswrapper[4957]: I0123 10:52:10.914685 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:10Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:10 crc kubenswrapper[4957]: I0123 10:52:10.925709 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb53662e-fe72-4c19-b3a6-f5b541e5afcb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://829cfdb541d2a7861316957f39b8b9f43ec6f9f4e309a491f4451b1f3c34a9a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d9f270c80ebedc7d79510e2f421e23789483dce954f5e1469469703660febf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\
\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90926087d1bb350c991fa9425706fcc22e12eec003aba87b72758892aae9d3a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8144556693b41dc2f9121be49ceed161caf8db5eec797f086128a2016be8072\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:10Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:10 crc kubenswrapper[4957]: I0123 10:52:10.935343 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rg9hb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a6ddd9-627a-4faa-a4c4-096ea19af31d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f469de9a3c43ade33ae855757f1244dcd825827dea9633af7143c078b08d6d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wngq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rg9hb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:10Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:10 crc kubenswrapper[4957]: I0123 10:52:10.945914 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2xjv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"224e3211-1f68-4673-8975-7e71b1e513d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dd046581d049e9ca0071a010da143a9b28d271b533b9cdc1c94d19311be0320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s26mf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f355e8990ff693448c7b8df392b7b2caeb59d6fee6cf8d5d4200f8ce1b5e03ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s26mf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w2xjv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:10Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:10 crc kubenswrapper[4957]: I0123 10:52:10.945981 4957 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:10 crc kubenswrapper[4957]: I0123 10:52:10.946011 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:10 crc kubenswrapper[4957]: I0123 10:52:10.946023 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:10 crc kubenswrapper[4957]: I0123 10:52:10.946042 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:10 crc kubenswrapper[4957]: I0123 10:52:10.946055 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:10Z","lastTransitionTime":"2026-01-23T10:52:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:10 crc kubenswrapper[4957]: I0123 10:52:10.961778 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tlz2g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"233fdd78-4010-4fe8-9068-ee47d8ff25d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6851e0ec1040550b8c9edb1b85213d2c849e381fae6b0f09c9a7247bd9c5088\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin
\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpwrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tlz2g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:10Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:10 crc kubenswrapper[4957]: I0123 10:52:10.981081 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87adc28a-89e3-4743-a9f2-098d4a9432d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a14cf2687aa7c7a4c43dffbc2ad99a41aef0e46719171f63c7f769ee2d54e4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca18567eec1b0cc34d911b28d9f3d670a061722086817f58236f6a0da557262\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f862c6f11fe904458a8ecde92079c0b4aa4a9cb4dfc6f2ca094a1d3142570d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0efa75cf10a812bc4de5b071048558eb5f48828f6fb3049f3820fe5e0b7e2b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1a90ac89ce8ac710e5f8cff26e69aff44a735ef8155a7e93324809904a33e01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26b4bdc4f2514902dc2c95df59af4c954a5c1905821f5981e9437ff54d6d544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d660c0e57541cf895c30cdd702c50a43e189e0ed
85fe800f422cc1646cbb57e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d660c0e57541cf895c30cdd702c50a43e189e0ed85fe800f422cc1646cbb57e5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-23T10:52:02Z\\\",\\\"message\\\":\\\"ft/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0123 10:52:02.002619 6388 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0123 10:52:02.002696 6388 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0123 10:52:02.003246 6388 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0123 10:52:02.003275 6388 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0123 10:52:02.003331 6388 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0123 10:52:02.003358 6388 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0123 10:52:02.003364 6388 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0123 10:52:02.003360 6388 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0123 10:52:02.003378 6388 factory.go:656] Stopping watch factory\\\\nI0123 10:52:02.003385 6388 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0123 10:52:02.003395 6388 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0123 10:52:02.003406 6388 handler.go:208] Removed *v1.Namespace ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-23T10:52:01Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-z8hcw_openshift-ovn-kubernetes(87adc28a-89e3-4743-a9f2-098d4a9432d8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1be5b459e3fae28da165ef0ee506ec5ccd39026d7b7e7c35a3f242c65d60d0ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3de3a4f166173fb3f0ff7428b1d1ddf43a9988749fc445ecff4a7527bd1f3a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3de3a4f166173fb3f0ff7428b1d1ddf43a9988749fc445ecff4a7527bd1f3a4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z8hcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:10Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:10 crc kubenswrapper[4957]: I0123 10:52:10.993773 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9rkq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"340bb9e5-0a20-4377-acf7-aba4b7788153\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac26363f220465a8681578f7a7b90cf7d0abf8676379a1e963f4998646327c95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:52:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55l78
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19e8778b10a2bb2a713374cb69e07a76edf2371f7a130a691e759a03c0322251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:52:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55l78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:52:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r9rkq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:10Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:11 crc kubenswrapper[4957]: I0123 10:52:11.048380 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:11 crc kubenswrapper[4957]: I0123 10:52:11.048427 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:11 crc kubenswrapper[4957]: I0123 10:52:11.048436 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:11 crc kubenswrapper[4957]: I0123 10:52:11.048451 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:11 crc kubenswrapper[4957]: I0123 10:52:11.048460 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:11Z","lastTransitionTime":"2026-01-23T10:52:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:11 crc kubenswrapper[4957]: I0123 10:52:11.151181 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:11 crc kubenswrapper[4957]: I0123 10:52:11.151258 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:11 crc kubenswrapper[4957]: I0123 10:52:11.151313 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:11 crc kubenswrapper[4957]: I0123 10:52:11.151360 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:11 crc kubenswrapper[4957]: I0123 10:52:11.151379 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:11Z","lastTransitionTime":"2026-01-23T10:52:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:11 crc kubenswrapper[4957]: I0123 10:52:11.254408 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:11 crc kubenswrapper[4957]: I0123 10:52:11.254452 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:11 crc kubenswrapper[4957]: I0123 10:52:11.254463 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:11 crc kubenswrapper[4957]: I0123 10:52:11.254480 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:11 crc kubenswrapper[4957]: I0123 10:52:11.254493 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:11Z","lastTransitionTime":"2026-01-23T10:52:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:11 crc kubenswrapper[4957]: I0123 10:52:11.357232 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:11 crc kubenswrapper[4957]: I0123 10:52:11.357343 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:11 crc kubenswrapper[4957]: I0123 10:52:11.357362 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:11 crc kubenswrapper[4957]: I0123 10:52:11.357388 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:11 crc kubenswrapper[4957]: I0123 10:52:11.357406 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:11Z","lastTransitionTime":"2026-01-23T10:52:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:11 crc kubenswrapper[4957]: I0123 10:52:11.460274 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:11 crc kubenswrapper[4957]: I0123 10:52:11.460572 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:11 crc kubenswrapper[4957]: I0123 10:52:11.460661 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:11 crc kubenswrapper[4957]: I0123 10:52:11.460754 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:11 crc kubenswrapper[4957]: I0123 10:52:11.460831 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:11Z","lastTransitionTime":"2026-01-23T10:52:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:11 crc kubenswrapper[4957]: I0123 10:52:11.563579 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:11 crc kubenswrapper[4957]: I0123 10:52:11.563823 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:11 crc kubenswrapper[4957]: I0123 10:52:11.563952 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:11 crc kubenswrapper[4957]: I0123 10:52:11.564083 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:11 crc kubenswrapper[4957]: I0123 10:52:11.564230 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:11Z","lastTransitionTime":"2026-01-23T10:52:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:11 crc kubenswrapper[4957]: I0123 10:52:11.666659 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:11 crc kubenswrapper[4957]: I0123 10:52:11.667422 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:11 crc kubenswrapper[4957]: I0123 10:52:11.667447 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:11 crc kubenswrapper[4957]: I0123 10:52:11.667465 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:11 crc kubenswrapper[4957]: I0123 10:52:11.667479 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:11Z","lastTransitionTime":"2026-01-23T10:52:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:11 crc kubenswrapper[4957]: I0123 10:52:11.737304 4957 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-13 20:27:10.150646491 +0000 UTC Jan 23 10:52:11 crc kubenswrapper[4957]: I0123 10:52:11.769798 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:11 crc kubenswrapper[4957]: I0123 10:52:11.769878 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:11 crc kubenswrapper[4957]: I0123 10:52:11.769902 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:11 crc kubenswrapper[4957]: I0123 10:52:11.769933 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:11 crc kubenswrapper[4957]: I0123 10:52:11.769960 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:11Z","lastTransitionTime":"2026-01-23T10:52:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:11 crc kubenswrapper[4957]: I0123 10:52:11.872264 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:11 crc kubenswrapper[4957]: I0123 10:52:11.872587 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:11 crc kubenswrapper[4957]: I0123 10:52:11.872670 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:11 crc kubenswrapper[4957]: I0123 10:52:11.872745 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:11 crc kubenswrapper[4957]: I0123 10:52:11.872822 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:11Z","lastTransitionTime":"2026-01-23T10:52:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:11 crc kubenswrapper[4957]: I0123 10:52:11.975117 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:11 crc kubenswrapper[4957]: I0123 10:52:11.975151 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:11 crc kubenswrapper[4957]: I0123 10:52:11.975159 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:11 crc kubenswrapper[4957]: I0123 10:52:11.975172 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:11 crc kubenswrapper[4957]: I0123 10:52:11.975180 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:11Z","lastTransitionTime":"2026-01-23T10:52:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:12 crc kubenswrapper[4957]: I0123 10:52:12.077173 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:12 crc kubenswrapper[4957]: I0123 10:52:12.077426 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:12 crc kubenswrapper[4957]: I0123 10:52:12.077562 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:12 crc kubenswrapper[4957]: I0123 10:52:12.077644 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:12 crc kubenswrapper[4957]: I0123 10:52:12.077710 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:12Z","lastTransitionTime":"2026-01-23T10:52:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:12 crc kubenswrapper[4957]: I0123 10:52:12.180562 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:12 crc kubenswrapper[4957]: I0123 10:52:12.180663 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:12 crc kubenswrapper[4957]: I0123 10:52:12.180734 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:12 crc kubenswrapper[4957]: I0123 10:52:12.180760 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:12 crc kubenswrapper[4957]: I0123 10:52:12.180777 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:12Z","lastTransitionTime":"2026-01-23T10:52:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:12 crc kubenswrapper[4957]: I0123 10:52:12.262651 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/87775b38-0664-48f6-8857-7568c135bd79-metrics-certs\") pod \"network-metrics-daemon-5fxfb\" (UID: \"87775b38-0664-48f6-8857-7568c135bd79\") " pod="openshift-multus/network-metrics-daemon-5fxfb" Jan 23 10:52:12 crc kubenswrapper[4957]: E0123 10:52:12.262838 4957 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 23 10:52:12 crc kubenswrapper[4957]: E0123 10:52:12.262910 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/87775b38-0664-48f6-8857-7568c135bd79-metrics-certs podName:87775b38-0664-48f6-8857-7568c135bd79 nodeName:}" failed. No retries permitted until 2026-01-23 10:52:20.262888116 +0000 UTC m=+49.800140833 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/87775b38-0664-48f6-8857-7568c135bd79-metrics-certs") pod "network-metrics-daemon-5fxfb" (UID: "87775b38-0664-48f6-8857-7568c135bd79") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 23 10:52:12 crc kubenswrapper[4957]: I0123 10:52:12.283256 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:12 crc kubenswrapper[4957]: I0123 10:52:12.283560 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:12 crc kubenswrapper[4957]: I0123 10:52:12.283665 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:12 crc kubenswrapper[4957]: I0123 10:52:12.283759 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:12 crc kubenswrapper[4957]: I0123 10:52:12.283847 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:12Z","lastTransitionTime":"2026-01-23T10:52:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:12 crc kubenswrapper[4957]: I0123 10:52:12.372646 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:12 crc kubenswrapper[4957]: I0123 10:52:12.372711 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:12 crc kubenswrapper[4957]: I0123 10:52:12.372734 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:12 crc kubenswrapper[4957]: I0123 10:52:12.372762 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:12 crc kubenswrapper[4957]: I0123 10:52:12.372783 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:12Z","lastTransitionTime":"2026-01-23T10:52:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:12 crc kubenswrapper[4957]: E0123 10:52:12.394554 4957 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"608e000a-3057-4f1e-b4ab-15bf3bfea3b8\\\",\\\"systemUUID\\\":\\\"4219e85c-09d5-42d3-a5cb-7c9fe3da136f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:12Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:12 crc kubenswrapper[4957]: I0123 10:52:12.398909 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:12 crc kubenswrapper[4957]: I0123 10:52:12.398938 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 23 10:52:12 crc kubenswrapper[4957]: I0123 10:52:12.398948 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:12 crc kubenswrapper[4957]: I0123 10:52:12.398962 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:12 crc kubenswrapper[4957]: I0123 10:52:12.398971 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:12Z","lastTransitionTime":"2026-01-23T10:52:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:12 crc kubenswrapper[4957]: E0123 10:52:12.416612 4957 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"608e000a-3057-4f1e-b4ab-15bf3bfea3b8\\\",\\\"systemUUID\\\":\\\"4219e85c-09d5-42d3-a5cb-7c9fe3da136f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:12Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:12 crc kubenswrapper[4957]: I0123 10:52:12.420405 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:12 crc kubenswrapper[4957]: I0123 10:52:12.420489 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 23 10:52:12 crc kubenswrapper[4957]: I0123 10:52:12.420511 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:12 crc kubenswrapper[4957]: I0123 10:52:12.420529 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:12 crc kubenswrapper[4957]: I0123 10:52:12.420566 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:12Z","lastTransitionTime":"2026-01-23T10:52:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:12 crc kubenswrapper[4957]: E0123 10:52:12.434386 4957 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"608e000a-3057-4f1e-b4ab-15bf3bfea3b8\\\",\\\"systemUUID\\\":\\\"4219e85c-09d5-42d3-a5cb-7c9fe3da136f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:12Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:12 crc kubenswrapper[4957]: I0123 10:52:12.438852 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:12 crc kubenswrapper[4957]: I0123 10:52:12.438900 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 23 10:52:12 crc kubenswrapper[4957]: I0123 10:52:12.438960 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:12 crc kubenswrapper[4957]: I0123 10:52:12.438982 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:12 crc kubenswrapper[4957]: I0123 10:52:12.439000 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:12Z","lastTransitionTime":"2026-01-23T10:52:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:12 crc kubenswrapper[4957]: E0123 10:52:12.459721 4957 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"608e000a-3057-4f1e-b4ab-15bf3bfea3b8\\\",\\\"systemUUID\\\":\\\"4219e85c-09d5-42d3-a5cb-7c9fe3da136f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:12Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:12 crc kubenswrapper[4957]: I0123 10:52:12.464995 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:12 crc kubenswrapper[4957]: I0123 10:52:12.465247 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 23 10:52:12 crc kubenswrapper[4957]: I0123 10:52:12.465465 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:12 crc kubenswrapper[4957]: I0123 10:52:12.465677 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:12 crc kubenswrapper[4957]: I0123 10:52:12.465841 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:12Z","lastTransitionTime":"2026-01-23T10:52:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:12 crc kubenswrapper[4957]: E0123 10:52:12.486654 4957 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"608e000a-3057-4f1e-b4ab-15bf3bfea3b8\\\",\\\"systemUUID\\\":\\\"4219e85c-09d5-42d3-a5cb-7c9fe3da136f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:12Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:12 crc kubenswrapper[4957]: E0123 10:52:12.487092 4957 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 23 10:52:12 crc kubenswrapper[4957]: I0123 10:52:12.489768 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Jan 23 10:52:12 crc kubenswrapper[4957]: I0123 10:52:12.489855 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:12 crc kubenswrapper[4957]: I0123 10:52:12.489876 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:12 crc kubenswrapper[4957]: I0123 10:52:12.489928 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:12 crc kubenswrapper[4957]: I0123 10:52:12.489949 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:12Z","lastTransitionTime":"2026-01-23T10:52:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:12 crc kubenswrapper[4957]: I0123 10:52:12.593058 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:12 crc kubenswrapper[4957]: I0123 10:52:12.593350 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:12 crc kubenswrapper[4957]: I0123 10:52:12.593419 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:12 crc kubenswrapper[4957]: I0123 10:52:12.593487 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:12 crc kubenswrapper[4957]: I0123 10:52:12.593576 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:12Z","lastTransitionTime":"2026-01-23T10:52:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:12 crc kubenswrapper[4957]: I0123 10:52:12.696561 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:12 crc kubenswrapper[4957]: I0123 10:52:12.696602 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:12 crc kubenswrapper[4957]: I0123 10:52:12.696615 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:12 crc kubenswrapper[4957]: I0123 10:52:12.696632 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:12 crc kubenswrapper[4957]: I0123 10:52:12.696644 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:12Z","lastTransitionTime":"2026-01-23T10:52:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:12 crc kubenswrapper[4957]: I0123 10:52:12.737986 4957 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-30 10:22:27.170639903 +0000 UTC Jan 23 10:52:12 crc kubenswrapper[4957]: I0123 10:52:12.769616 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 23 10:52:12 crc kubenswrapper[4957]: I0123 10:52:12.769745 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 23 10:52:12 crc kubenswrapper[4957]: E0123 10:52:12.769933 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 23 10:52:12 crc kubenswrapper[4957]: I0123 10:52:12.770017 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 23 10:52:12 crc kubenswrapper[4957]: E0123 10:52:12.770070 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 23 10:52:12 crc kubenswrapper[4957]: I0123 10:52:12.770035 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5fxfb" Jan 23 10:52:12 crc kubenswrapper[4957]: E0123 10:52:12.770221 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 23 10:52:12 crc kubenswrapper[4957]: E0123 10:52:12.770438 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5fxfb" podUID="87775b38-0664-48f6-8857-7568c135bd79" Jan 23 10:52:12 crc kubenswrapper[4957]: I0123 10:52:12.799558 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:12 crc kubenswrapper[4957]: I0123 10:52:12.799634 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:12 crc kubenswrapper[4957]: I0123 10:52:12.799651 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:12 crc kubenswrapper[4957]: I0123 10:52:12.799675 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:12 crc kubenswrapper[4957]: I0123 10:52:12.799697 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:12Z","lastTransitionTime":"2026-01-23T10:52:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:12 crc kubenswrapper[4957]: I0123 10:52:12.902544 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:12 crc kubenswrapper[4957]: I0123 10:52:12.902815 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:12 crc kubenswrapper[4957]: I0123 10:52:12.902841 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:12 crc kubenswrapper[4957]: I0123 10:52:12.902874 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:12 crc kubenswrapper[4957]: I0123 10:52:12.902897 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:12Z","lastTransitionTime":"2026-01-23T10:52:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:13 crc kubenswrapper[4957]: I0123 10:52:13.004893 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:13 crc kubenswrapper[4957]: I0123 10:52:13.004958 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:13 crc kubenswrapper[4957]: I0123 10:52:13.004976 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:13 crc kubenswrapper[4957]: I0123 10:52:13.005003 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:13 crc kubenswrapper[4957]: I0123 10:52:13.005025 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:13Z","lastTransitionTime":"2026-01-23T10:52:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:13 crc kubenswrapper[4957]: I0123 10:52:13.108189 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:13 crc kubenswrapper[4957]: I0123 10:52:13.108236 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:13 crc kubenswrapper[4957]: I0123 10:52:13.108253 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:13 crc kubenswrapper[4957]: I0123 10:52:13.108306 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:13 crc kubenswrapper[4957]: I0123 10:52:13.108323 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:13Z","lastTransitionTime":"2026-01-23T10:52:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:13 crc kubenswrapper[4957]: I0123 10:52:13.210792 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:13 crc kubenswrapper[4957]: I0123 10:52:13.210867 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:13 crc kubenswrapper[4957]: I0123 10:52:13.210885 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:13 crc kubenswrapper[4957]: I0123 10:52:13.210911 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:13 crc kubenswrapper[4957]: I0123 10:52:13.210931 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:13Z","lastTransitionTime":"2026-01-23T10:52:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:13 crc kubenswrapper[4957]: I0123 10:52:13.314122 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:13 crc kubenswrapper[4957]: I0123 10:52:13.314169 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:13 crc kubenswrapper[4957]: I0123 10:52:13.314181 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:13 crc kubenswrapper[4957]: I0123 10:52:13.314202 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:13 crc kubenswrapper[4957]: I0123 10:52:13.314219 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:13Z","lastTransitionTime":"2026-01-23T10:52:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:13 crc kubenswrapper[4957]: I0123 10:52:13.417821 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:13 crc kubenswrapper[4957]: I0123 10:52:13.417894 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:13 crc kubenswrapper[4957]: I0123 10:52:13.417910 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:13 crc kubenswrapper[4957]: I0123 10:52:13.418168 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:13 crc kubenswrapper[4957]: I0123 10:52:13.418197 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:13Z","lastTransitionTime":"2026-01-23T10:52:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:13 crc kubenswrapper[4957]: I0123 10:52:13.520747 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:13 crc kubenswrapper[4957]: I0123 10:52:13.520826 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:13 crc kubenswrapper[4957]: I0123 10:52:13.520847 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:13 crc kubenswrapper[4957]: I0123 10:52:13.520878 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:13 crc kubenswrapper[4957]: I0123 10:52:13.520900 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:13Z","lastTransitionTime":"2026-01-23T10:52:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:13 crc kubenswrapper[4957]: I0123 10:52:13.623603 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:13 crc kubenswrapper[4957]: I0123 10:52:13.623657 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:13 crc kubenswrapper[4957]: I0123 10:52:13.623670 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:13 crc kubenswrapper[4957]: I0123 10:52:13.623690 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:13 crc kubenswrapper[4957]: I0123 10:52:13.623704 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:13Z","lastTransitionTime":"2026-01-23T10:52:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:13 crc kubenswrapper[4957]: I0123 10:52:13.726371 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:13 crc kubenswrapper[4957]: I0123 10:52:13.726432 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:13 crc kubenswrapper[4957]: I0123 10:52:13.726448 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:13 crc kubenswrapper[4957]: I0123 10:52:13.726472 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:13 crc kubenswrapper[4957]: I0123 10:52:13.726484 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:13Z","lastTransitionTime":"2026-01-23T10:52:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:13 crc kubenswrapper[4957]: I0123 10:52:13.738902 4957 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-08 09:51:17.767356153 +0000 UTC Jan 23 10:52:13 crc kubenswrapper[4957]: I0123 10:52:13.828512 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:13 crc kubenswrapper[4957]: I0123 10:52:13.828552 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:13 crc kubenswrapper[4957]: I0123 10:52:13.828564 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:13 crc kubenswrapper[4957]: I0123 10:52:13.828582 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:13 crc kubenswrapper[4957]: I0123 10:52:13.828593 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:13Z","lastTransitionTime":"2026-01-23T10:52:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:13 crc kubenswrapper[4957]: I0123 10:52:13.931890 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:13 crc kubenswrapper[4957]: I0123 10:52:13.931951 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:13 crc kubenswrapper[4957]: I0123 10:52:13.931965 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:13 crc kubenswrapper[4957]: I0123 10:52:13.931989 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:13 crc kubenswrapper[4957]: I0123 10:52:13.932003 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:13Z","lastTransitionTime":"2026-01-23T10:52:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:14 crc kubenswrapper[4957]: I0123 10:52:14.034357 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:14 crc kubenswrapper[4957]: I0123 10:52:14.034410 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:14 crc kubenswrapper[4957]: I0123 10:52:14.034424 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:14 crc kubenswrapper[4957]: I0123 10:52:14.034446 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:14 crc kubenswrapper[4957]: I0123 10:52:14.034460 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:14Z","lastTransitionTime":"2026-01-23T10:52:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:14 crc kubenswrapper[4957]: I0123 10:52:14.137266 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:14 crc kubenswrapper[4957]: I0123 10:52:14.137370 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:14 crc kubenswrapper[4957]: I0123 10:52:14.137389 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:14 crc kubenswrapper[4957]: I0123 10:52:14.137419 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:14 crc kubenswrapper[4957]: I0123 10:52:14.137445 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:14Z","lastTransitionTime":"2026-01-23T10:52:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:14 crc kubenswrapper[4957]: I0123 10:52:14.241011 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:14 crc kubenswrapper[4957]: I0123 10:52:14.241080 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:14 crc kubenswrapper[4957]: I0123 10:52:14.241096 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:14 crc kubenswrapper[4957]: I0123 10:52:14.241122 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:14 crc kubenswrapper[4957]: I0123 10:52:14.241139 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:14Z","lastTransitionTime":"2026-01-23T10:52:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:14 crc kubenswrapper[4957]: I0123 10:52:14.344808 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:14 crc kubenswrapper[4957]: I0123 10:52:14.344889 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:14 crc kubenswrapper[4957]: I0123 10:52:14.344939 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:14 crc kubenswrapper[4957]: I0123 10:52:14.344963 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:14 crc kubenswrapper[4957]: I0123 10:52:14.344980 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:14Z","lastTransitionTime":"2026-01-23T10:52:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:14 crc kubenswrapper[4957]: I0123 10:52:14.448241 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:14 crc kubenswrapper[4957]: I0123 10:52:14.448336 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:14 crc kubenswrapper[4957]: I0123 10:52:14.448358 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:14 crc kubenswrapper[4957]: I0123 10:52:14.448384 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:14 crc kubenswrapper[4957]: I0123 10:52:14.448401 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:14Z","lastTransitionTime":"2026-01-23T10:52:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:14 crc kubenswrapper[4957]: I0123 10:52:14.551133 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:14 crc kubenswrapper[4957]: I0123 10:52:14.551179 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:14 crc kubenswrapper[4957]: I0123 10:52:14.551189 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:14 crc kubenswrapper[4957]: I0123 10:52:14.551205 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:14 crc kubenswrapper[4957]: I0123 10:52:14.551217 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:14Z","lastTransitionTime":"2026-01-23T10:52:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:14 crc kubenswrapper[4957]: I0123 10:52:14.654925 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:14 crc kubenswrapper[4957]: I0123 10:52:14.654993 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:14 crc kubenswrapper[4957]: I0123 10:52:14.655012 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:14 crc kubenswrapper[4957]: I0123 10:52:14.655037 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:14 crc kubenswrapper[4957]: I0123 10:52:14.655062 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:14Z","lastTransitionTime":"2026-01-23T10:52:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:14 crc kubenswrapper[4957]: I0123 10:52:14.739851 4957 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-07 23:40:29.341409592 +0000 UTC Jan 23 10:52:14 crc kubenswrapper[4957]: I0123 10:52:14.758397 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:14 crc kubenswrapper[4957]: I0123 10:52:14.758464 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:14 crc kubenswrapper[4957]: I0123 10:52:14.758493 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:14 crc kubenswrapper[4957]: I0123 10:52:14.758527 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:14 crc kubenswrapper[4957]: I0123 10:52:14.758549 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:14Z","lastTransitionTime":"2026-01-23T10:52:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:14 crc kubenswrapper[4957]: I0123 10:52:14.769894 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 23 10:52:14 crc kubenswrapper[4957]: I0123 10:52:14.769977 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 23 10:52:14 crc kubenswrapper[4957]: I0123 10:52:14.770003 4957 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 23 10:52:14 crc kubenswrapper[4957]: E0123 10:52:14.770124 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 23 10:52:14 crc kubenswrapper[4957]: E0123 10:52:14.770212 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 23 10:52:14 crc kubenswrapper[4957]: I0123 10:52:14.770238 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5fxfb" Jan 23 10:52:14 crc kubenswrapper[4957]: E0123 10:52:14.770345 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 23 10:52:14 crc kubenswrapper[4957]: E0123 10:52:14.770431 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5fxfb" podUID="87775b38-0664-48f6-8857-7568c135bd79" Jan 23 10:52:14 crc kubenswrapper[4957]: I0123 10:52:14.861059 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:14 crc kubenswrapper[4957]: I0123 10:52:14.861121 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:14 crc kubenswrapper[4957]: I0123 10:52:14.861139 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:14 crc kubenswrapper[4957]: I0123 10:52:14.861164 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:14 crc kubenswrapper[4957]: I0123 10:52:14.861186 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:14Z","lastTransitionTime":"2026-01-23T10:52:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:14 crc kubenswrapper[4957]: I0123 10:52:14.965575 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:14 crc kubenswrapper[4957]: I0123 10:52:14.965626 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:14 crc kubenswrapper[4957]: I0123 10:52:14.965640 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:14 crc kubenswrapper[4957]: I0123 10:52:14.965662 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:14 crc kubenswrapper[4957]: I0123 10:52:14.965678 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:14Z","lastTransitionTime":"2026-01-23T10:52:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:15 crc kubenswrapper[4957]: I0123 10:52:15.068032 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:15 crc kubenswrapper[4957]: I0123 10:52:15.068072 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:15 crc kubenswrapper[4957]: I0123 10:52:15.068083 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:15 crc kubenswrapper[4957]: I0123 10:52:15.068103 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:15 crc kubenswrapper[4957]: I0123 10:52:15.068117 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:15Z","lastTransitionTime":"2026-01-23T10:52:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:15 crc kubenswrapper[4957]: I0123 10:52:15.171053 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:15 crc kubenswrapper[4957]: I0123 10:52:15.171096 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:15 crc kubenswrapper[4957]: I0123 10:52:15.171107 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:15 crc kubenswrapper[4957]: I0123 10:52:15.171126 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:15 crc kubenswrapper[4957]: I0123 10:52:15.171138 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:15Z","lastTransitionTime":"2026-01-23T10:52:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:15 crc kubenswrapper[4957]: I0123 10:52:15.274259 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:15 crc kubenswrapper[4957]: I0123 10:52:15.274328 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:15 crc kubenswrapper[4957]: I0123 10:52:15.274339 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:15 crc kubenswrapper[4957]: I0123 10:52:15.274356 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:15 crc kubenswrapper[4957]: I0123 10:52:15.274368 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:15Z","lastTransitionTime":"2026-01-23T10:52:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:15 crc kubenswrapper[4957]: I0123 10:52:15.385410 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:15 crc kubenswrapper[4957]: I0123 10:52:15.385499 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:15 crc kubenswrapper[4957]: I0123 10:52:15.385527 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:15 crc kubenswrapper[4957]: I0123 10:52:15.385567 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:15 crc kubenswrapper[4957]: I0123 10:52:15.385602 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:15Z","lastTransitionTime":"2026-01-23T10:52:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:15 crc kubenswrapper[4957]: I0123 10:52:15.489337 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:15 crc kubenswrapper[4957]: I0123 10:52:15.489395 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:15 crc kubenswrapper[4957]: I0123 10:52:15.489413 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:15 crc kubenswrapper[4957]: I0123 10:52:15.489481 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:15 crc kubenswrapper[4957]: I0123 10:52:15.489499 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:15Z","lastTransitionTime":"2026-01-23T10:52:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:15 crc kubenswrapper[4957]: I0123 10:52:15.592470 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:15 crc kubenswrapper[4957]: I0123 10:52:15.592738 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:15 crc kubenswrapper[4957]: I0123 10:52:15.592778 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:15 crc kubenswrapper[4957]: I0123 10:52:15.592807 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:15 crc kubenswrapper[4957]: I0123 10:52:15.592829 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:15Z","lastTransitionTime":"2026-01-23T10:52:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:15 crc kubenswrapper[4957]: I0123 10:52:15.695405 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:15 crc kubenswrapper[4957]: I0123 10:52:15.695527 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:15 crc kubenswrapper[4957]: I0123 10:52:15.695549 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:15 crc kubenswrapper[4957]: I0123 10:52:15.695572 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:15 crc kubenswrapper[4957]: I0123 10:52:15.695589 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:15Z","lastTransitionTime":"2026-01-23T10:52:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:15 crc kubenswrapper[4957]: I0123 10:52:15.741011 4957 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-05 01:29:15.626335127 +0000 UTC Jan 23 10:52:15 crc kubenswrapper[4957]: I0123 10:52:15.798124 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:15 crc kubenswrapper[4957]: I0123 10:52:15.798183 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:15 crc kubenswrapper[4957]: I0123 10:52:15.798205 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:15 crc kubenswrapper[4957]: I0123 10:52:15.798233 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:15 crc kubenswrapper[4957]: I0123 10:52:15.798254 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:15Z","lastTransitionTime":"2026-01-23T10:52:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:15 crc kubenswrapper[4957]: I0123 10:52:15.901138 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:15 crc kubenswrapper[4957]: I0123 10:52:15.901258 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:15 crc kubenswrapper[4957]: I0123 10:52:15.901328 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:15 crc kubenswrapper[4957]: I0123 10:52:15.901363 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:15 crc kubenswrapper[4957]: I0123 10:52:15.901388 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:15Z","lastTransitionTime":"2026-01-23T10:52:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:16 crc kubenswrapper[4957]: I0123 10:52:16.004635 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:16 crc kubenswrapper[4957]: I0123 10:52:16.004725 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:16 crc kubenswrapper[4957]: I0123 10:52:16.004756 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:16 crc kubenswrapper[4957]: I0123 10:52:16.004788 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:16 crc kubenswrapper[4957]: I0123 10:52:16.004817 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:16Z","lastTransitionTime":"2026-01-23T10:52:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:16 crc kubenswrapper[4957]: I0123 10:52:16.107564 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:16 crc kubenswrapper[4957]: I0123 10:52:16.107616 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:16 crc kubenswrapper[4957]: I0123 10:52:16.107635 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:16 crc kubenswrapper[4957]: I0123 10:52:16.107661 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:16 crc kubenswrapper[4957]: I0123 10:52:16.107678 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:16Z","lastTransitionTime":"2026-01-23T10:52:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:16 crc kubenswrapper[4957]: I0123 10:52:16.209932 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:16 crc kubenswrapper[4957]: I0123 10:52:16.209966 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:16 crc kubenswrapper[4957]: I0123 10:52:16.209978 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:16 crc kubenswrapper[4957]: I0123 10:52:16.209994 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:16 crc kubenswrapper[4957]: I0123 10:52:16.210006 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:16Z","lastTransitionTime":"2026-01-23T10:52:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:16 crc kubenswrapper[4957]: I0123 10:52:16.313202 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:16 crc kubenswrapper[4957]: I0123 10:52:16.313308 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:16 crc kubenswrapper[4957]: I0123 10:52:16.313333 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:16 crc kubenswrapper[4957]: I0123 10:52:16.313364 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:16 crc kubenswrapper[4957]: I0123 10:52:16.313386 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:16Z","lastTransitionTime":"2026-01-23T10:52:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:16 crc kubenswrapper[4957]: I0123 10:52:16.416205 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:16 crc kubenswrapper[4957]: I0123 10:52:16.416256 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:16 crc kubenswrapper[4957]: I0123 10:52:16.416267 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:16 crc kubenswrapper[4957]: I0123 10:52:16.416306 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:16 crc kubenswrapper[4957]: I0123 10:52:16.416320 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:16Z","lastTransitionTime":"2026-01-23T10:52:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:16 crc kubenswrapper[4957]: I0123 10:52:16.518868 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:16 crc kubenswrapper[4957]: I0123 10:52:16.518923 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:16 crc kubenswrapper[4957]: I0123 10:52:16.518937 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:16 crc kubenswrapper[4957]: I0123 10:52:16.518961 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:16 crc kubenswrapper[4957]: I0123 10:52:16.518977 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:16Z","lastTransitionTime":"2026-01-23T10:52:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:16 crc kubenswrapper[4957]: I0123 10:52:16.621791 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:16 crc kubenswrapper[4957]: I0123 10:52:16.621845 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:16 crc kubenswrapper[4957]: I0123 10:52:16.621860 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:16 crc kubenswrapper[4957]: I0123 10:52:16.621882 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:16 crc kubenswrapper[4957]: I0123 10:52:16.621896 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:16Z","lastTransitionTime":"2026-01-23T10:52:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:16 crc kubenswrapper[4957]: I0123 10:52:16.724852 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:16 crc kubenswrapper[4957]: I0123 10:52:16.724898 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:16 crc kubenswrapper[4957]: I0123 10:52:16.724909 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:16 crc kubenswrapper[4957]: I0123 10:52:16.724925 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:16 crc kubenswrapper[4957]: I0123 10:52:16.724937 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:16Z","lastTransitionTime":"2026-01-23T10:52:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:16 crc kubenswrapper[4957]: I0123 10:52:16.741659 4957 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-16 15:35:13.91295177 +0000 UTC Jan 23 10:52:16 crc kubenswrapper[4957]: I0123 10:52:16.769032 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 23 10:52:16 crc kubenswrapper[4957]: I0123 10:52:16.769091 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 23 10:52:16 crc kubenswrapper[4957]: I0123 10:52:16.769114 4957 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-5fxfb" Jan 23 10:52:16 crc kubenswrapper[4957]: E0123 10:52:16.769220 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 23 10:52:16 crc kubenswrapper[4957]: I0123 10:52:16.769238 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 23 10:52:16 crc kubenswrapper[4957]: E0123 10:52:16.769371 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5fxfb" podUID="87775b38-0664-48f6-8857-7568c135bd79" Jan 23 10:52:16 crc kubenswrapper[4957]: E0123 10:52:16.769455 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 23 10:52:16 crc kubenswrapper[4957]: E0123 10:52:16.769499 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 23 10:52:16 crc kubenswrapper[4957]: I0123 10:52:16.828203 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:16 crc kubenswrapper[4957]: I0123 10:52:16.828253 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:16 crc kubenswrapper[4957]: I0123 10:52:16.828266 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:16 crc kubenswrapper[4957]: I0123 10:52:16.828309 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:16 crc kubenswrapper[4957]: I0123 10:52:16.828328 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:16Z","lastTransitionTime":"2026-01-23T10:52:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:16 crc kubenswrapper[4957]: I0123 10:52:16.931585 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:16 crc kubenswrapper[4957]: I0123 10:52:16.931637 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:16 crc kubenswrapper[4957]: I0123 10:52:16.931652 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:16 crc kubenswrapper[4957]: I0123 10:52:16.931673 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:16 crc kubenswrapper[4957]: I0123 10:52:16.931691 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:16Z","lastTransitionTime":"2026-01-23T10:52:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:17 crc kubenswrapper[4957]: I0123 10:52:17.034797 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:17 crc kubenswrapper[4957]: I0123 10:52:17.034886 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:17 crc kubenswrapper[4957]: I0123 10:52:17.034900 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:17 crc kubenswrapper[4957]: I0123 10:52:17.034922 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:17 crc kubenswrapper[4957]: I0123 10:52:17.034938 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:17Z","lastTransitionTime":"2026-01-23T10:52:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:17 crc kubenswrapper[4957]: I0123 10:52:17.138786 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:17 crc kubenswrapper[4957]: I0123 10:52:17.138837 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:17 crc kubenswrapper[4957]: I0123 10:52:17.138848 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:17 crc kubenswrapper[4957]: I0123 10:52:17.138867 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:17 crc kubenswrapper[4957]: I0123 10:52:17.138882 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:17Z","lastTransitionTime":"2026-01-23T10:52:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:17 crc kubenswrapper[4957]: I0123 10:52:17.242340 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:17 crc kubenswrapper[4957]: I0123 10:52:17.242385 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:17 crc kubenswrapper[4957]: I0123 10:52:17.242415 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:17 crc kubenswrapper[4957]: I0123 10:52:17.242437 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:17 crc kubenswrapper[4957]: I0123 10:52:17.242449 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:17Z","lastTransitionTime":"2026-01-23T10:52:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:17 crc kubenswrapper[4957]: I0123 10:52:17.345026 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:17 crc kubenswrapper[4957]: I0123 10:52:17.345119 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:17 crc kubenswrapper[4957]: I0123 10:52:17.345142 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:17 crc kubenswrapper[4957]: I0123 10:52:17.345171 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:17 crc kubenswrapper[4957]: I0123 10:52:17.345196 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:17Z","lastTransitionTime":"2026-01-23T10:52:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:17 crc kubenswrapper[4957]: I0123 10:52:17.447993 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:17 crc kubenswrapper[4957]: I0123 10:52:17.448065 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:17 crc kubenswrapper[4957]: I0123 10:52:17.448077 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:17 crc kubenswrapper[4957]: I0123 10:52:17.448094 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:17 crc kubenswrapper[4957]: I0123 10:52:17.448106 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:17Z","lastTransitionTime":"2026-01-23T10:52:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:17 crc kubenswrapper[4957]: I0123 10:52:17.551173 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:17 crc kubenswrapper[4957]: I0123 10:52:17.551230 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:17 crc kubenswrapper[4957]: I0123 10:52:17.551243 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:17 crc kubenswrapper[4957]: I0123 10:52:17.551263 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:17 crc kubenswrapper[4957]: I0123 10:52:17.551273 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:17Z","lastTransitionTime":"2026-01-23T10:52:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:17 crc kubenswrapper[4957]: I0123 10:52:17.653074 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:17 crc kubenswrapper[4957]: I0123 10:52:17.653104 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:17 crc kubenswrapper[4957]: I0123 10:52:17.653111 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:17 crc kubenswrapper[4957]: I0123 10:52:17.653125 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:17 crc kubenswrapper[4957]: I0123 10:52:17.653133 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:17Z","lastTransitionTime":"2026-01-23T10:52:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:17 crc kubenswrapper[4957]: I0123 10:52:17.742323 4957 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-09 01:51:13.955016745 +0000 UTC Jan 23 10:52:17 crc kubenswrapper[4957]: I0123 10:52:17.755184 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:17 crc kubenswrapper[4957]: I0123 10:52:17.755227 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:17 crc kubenswrapper[4957]: I0123 10:52:17.755238 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:17 crc kubenswrapper[4957]: I0123 10:52:17.755256 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:17 crc kubenswrapper[4957]: I0123 10:52:17.755269 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:17Z","lastTransitionTime":"2026-01-23T10:52:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:17 crc kubenswrapper[4957]: I0123 10:52:17.858046 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:17 crc kubenswrapper[4957]: I0123 10:52:17.858072 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:17 crc kubenswrapper[4957]: I0123 10:52:17.858082 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:17 crc kubenswrapper[4957]: I0123 10:52:17.858096 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:17 crc kubenswrapper[4957]: I0123 10:52:17.858107 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:17Z","lastTransitionTime":"2026-01-23T10:52:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:17 crc kubenswrapper[4957]: I0123 10:52:17.961401 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:17 crc kubenswrapper[4957]: I0123 10:52:17.961445 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:17 crc kubenswrapper[4957]: I0123 10:52:17.961456 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:17 crc kubenswrapper[4957]: I0123 10:52:17.961475 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:17 crc kubenswrapper[4957]: I0123 10:52:17.961487 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:17Z","lastTransitionTime":"2026-01-23T10:52:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:18 crc kubenswrapper[4957]: I0123 10:52:18.065939 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:18 crc kubenswrapper[4957]: I0123 10:52:18.065985 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:18 crc kubenswrapper[4957]: I0123 10:52:18.065997 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:18 crc kubenswrapper[4957]: I0123 10:52:18.066018 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:18 crc kubenswrapper[4957]: I0123 10:52:18.066040 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:18Z","lastTransitionTime":"2026-01-23T10:52:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:18 crc kubenswrapper[4957]: I0123 10:52:18.169784 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:18 crc kubenswrapper[4957]: I0123 10:52:18.169836 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:18 crc kubenswrapper[4957]: I0123 10:52:18.169889 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:18 crc kubenswrapper[4957]: I0123 10:52:18.169912 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:18 crc kubenswrapper[4957]: I0123 10:52:18.169928 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:18Z","lastTransitionTime":"2026-01-23T10:52:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:18 crc kubenswrapper[4957]: I0123 10:52:18.273366 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:18 crc kubenswrapper[4957]: I0123 10:52:18.273425 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:18 crc kubenswrapper[4957]: I0123 10:52:18.273443 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:18 crc kubenswrapper[4957]: I0123 10:52:18.273468 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:18 crc kubenswrapper[4957]: I0123 10:52:18.273490 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:18Z","lastTransitionTime":"2026-01-23T10:52:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:18 crc kubenswrapper[4957]: I0123 10:52:18.376923 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:18 crc kubenswrapper[4957]: I0123 10:52:18.376996 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:18 crc kubenswrapper[4957]: I0123 10:52:18.377013 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:18 crc kubenswrapper[4957]: I0123 10:52:18.377036 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:18 crc kubenswrapper[4957]: I0123 10:52:18.377054 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:18Z","lastTransitionTime":"2026-01-23T10:52:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:18 crc kubenswrapper[4957]: I0123 10:52:18.479742 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:18 crc kubenswrapper[4957]: I0123 10:52:18.479822 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:18 crc kubenswrapper[4957]: I0123 10:52:18.479858 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:18 crc kubenswrapper[4957]: I0123 10:52:18.479893 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:18 crc kubenswrapper[4957]: I0123 10:52:18.479916 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:18Z","lastTransitionTime":"2026-01-23T10:52:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:18 crc kubenswrapper[4957]: I0123 10:52:18.582683 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:18 crc kubenswrapper[4957]: I0123 10:52:18.582726 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:18 crc kubenswrapper[4957]: I0123 10:52:18.582735 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:18 crc kubenswrapper[4957]: I0123 10:52:18.582750 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:18 crc kubenswrapper[4957]: I0123 10:52:18.582761 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:18Z","lastTransitionTime":"2026-01-23T10:52:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:18 crc kubenswrapper[4957]: I0123 10:52:18.685477 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:18 crc kubenswrapper[4957]: I0123 10:52:18.685551 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:18 crc kubenswrapper[4957]: I0123 10:52:18.685568 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:18 crc kubenswrapper[4957]: I0123 10:52:18.685592 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:18 crc kubenswrapper[4957]: I0123 10:52:18.685609 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:18Z","lastTransitionTime":"2026-01-23T10:52:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:18 crc kubenswrapper[4957]: I0123 10:52:18.742809 4957 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-16 19:38:08.522977467 +0000 UTC Jan 23 10:52:18 crc kubenswrapper[4957]: I0123 10:52:18.769751 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5fxfb" Jan 23 10:52:18 crc kubenswrapper[4957]: E0123 10:52:18.770188 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5fxfb" podUID="87775b38-0664-48f6-8857-7568c135bd79" Jan 23 10:52:18 crc kubenswrapper[4957]: I0123 10:52:18.769885 4957 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 23 10:52:18 crc kubenswrapper[4957]: E0123 10:52:18.770413 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 23 10:52:18 crc kubenswrapper[4957]: I0123 10:52:18.769957 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 23 10:52:18 crc kubenswrapper[4957]: E0123 10:52:18.770563 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 23 10:52:18 crc kubenswrapper[4957]: I0123 10:52:18.769867 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 23 10:52:18 crc kubenswrapper[4957]: E0123 10:52:18.770704 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 23 10:52:18 crc kubenswrapper[4957]: I0123 10:52:18.788077 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:18 crc kubenswrapper[4957]: I0123 10:52:18.788136 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:18 crc kubenswrapper[4957]: I0123 10:52:18.788154 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:18 crc kubenswrapper[4957]: I0123 10:52:18.788178 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:18 crc kubenswrapper[4957]: I0123 10:52:18.788195 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:18Z","lastTransitionTime":"2026-01-23T10:52:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:18 crc kubenswrapper[4957]: I0123 10:52:18.890831 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:18 crc kubenswrapper[4957]: I0123 10:52:18.890904 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:18 crc kubenswrapper[4957]: I0123 10:52:18.890925 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:18 crc kubenswrapper[4957]: I0123 10:52:18.890956 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:18 crc kubenswrapper[4957]: I0123 10:52:18.890978 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:18Z","lastTransitionTime":"2026-01-23T10:52:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:18 crc kubenswrapper[4957]: I0123 10:52:18.993802 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:18 crc kubenswrapper[4957]: I0123 10:52:18.993875 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:18 crc kubenswrapper[4957]: I0123 10:52:18.993895 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:18 crc kubenswrapper[4957]: I0123 10:52:18.993925 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:18 crc kubenswrapper[4957]: I0123 10:52:18.993943 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:18Z","lastTransitionTime":"2026-01-23T10:52:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:19 crc kubenswrapper[4957]: I0123 10:52:19.096486 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:19 crc kubenswrapper[4957]: I0123 10:52:19.096523 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:19 crc kubenswrapper[4957]: I0123 10:52:19.096536 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:19 crc kubenswrapper[4957]: I0123 10:52:19.096553 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:19 crc kubenswrapper[4957]: I0123 10:52:19.096564 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:19Z","lastTransitionTime":"2026-01-23T10:52:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:19 crc kubenswrapper[4957]: I0123 10:52:19.199213 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:19 crc kubenswrapper[4957]: I0123 10:52:19.199270 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:19 crc kubenswrapper[4957]: I0123 10:52:19.199341 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:19 crc kubenswrapper[4957]: I0123 10:52:19.199376 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:19 crc kubenswrapper[4957]: I0123 10:52:19.199401 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:19Z","lastTransitionTime":"2026-01-23T10:52:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:19 crc kubenswrapper[4957]: I0123 10:52:19.302741 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:19 crc kubenswrapper[4957]: I0123 10:52:19.302810 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:19 crc kubenswrapper[4957]: I0123 10:52:19.302826 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:19 crc kubenswrapper[4957]: I0123 10:52:19.302850 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:19 crc kubenswrapper[4957]: I0123 10:52:19.302866 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:19Z","lastTransitionTime":"2026-01-23T10:52:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:19 crc kubenswrapper[4957]: I0123 10:52:19.405646 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:19 crc kubenswrapper[4957]: I0123 10:52:19.405725 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:19 crc kubenswrapper[4957]: I0123 10:52:19.405752 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:19 crc kubenswrapper[4957]: I0123 10:52:19.405812 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:19 crc kubenswrapper[4957]: I0123 10:52:19.405914 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:19Z","lastTransitionTime":"2026-01-23T10:52:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:19 crc kubenswrapper[4957]: I0123 10:52:19.509320 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:19 crc kubenswrapper[4957]: I0123 10:52:19.509365 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:19 crc kubenswrapper[4957]: I0123 10:52:19.509376 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:19 crc kubenswrapper[4957]: I0123 10:52:19.509393 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:19 crc kubenswrapper[4957]: I0123 10:52:19.509404 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:19Z","lastTransitionTime":"2026-01-23T10:52:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:19 crc kubenswrapper[4957]: I0123 10:52:19.612815 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:19 crc kubenswrapper[4957]: I0123 10:52:19.612871 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:19 crc kubenswrapper[4957]: I0123 10:52:19.612887 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:19 crc kubenswrapper[4957]: I0123 10:52:19.612912 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:19 crc kubenswrapper[4957]: I0123 10:52:19.612932 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:19Z","lastTransitionTime":"2026-01-23T10:52:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:19 crc kubenswrapper[4957]: I0123 10:52:19.714464 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:19 crc kubenswrapper[4957]: I0123 10:52:19.714502 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:19 crc kubenswrapper[4957]: I0123 10:52:19.714511 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:19 crc kubenswrapper[4957]: I0123 10:52:19.714528 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:19 crc kubenswrapper[4957]: I0123 10:52:19.714541 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:19Z","lastTransitionTime":"2026-01-23T10:52:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:19 crc kubenswrapper[4957]: I0123 10:52:19.743293 4957 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 20:29:06.286711221 +0000 UTC Jan 23 10:52:19 crc kubenswrapper[4957]: I0123 10:52:19.816974 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:19 crc kubenswrapper[4957]: I0123 10:52:19.817055 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:19 crc kubenswrapper[4957]: I0123 10:52:19.817079 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:19 crc kubenswrapper[4957]: I0123 10:52:19.817112 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:19 crc kubenswrapper[4957]: I0123 10:52:19.817136 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:19Z","lastTransitionTime":"2026-01-23T10:52:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:19 crc kubenswrapper[4957]: I0123 10:52:19.920084 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:19 crc kubenswrapper[4957]: I0123 10:52:19.920148 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:19 crc kubenswrapper[4957]: I0123 10:52:19.920170 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:19 crc kubenswrapper[4957]: I0123 10:52:19.920196 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:19 crc kubenswrapper[4957]: I0123 10:52:19.920213 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:19Z","lastTransitionTime":"2026-01-23T10:52:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:20 crc kubenswrapper[4957]: I0123 10:52:20.023997 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:20 crc kubenswrapper[4957]: I0123 10:52:20.024085 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:20 crc kubenswrapper[4957]: I0123 10:52:20.024108 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:20 crc kubenswrapper[4957]: I0123 10:52:20.024132 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:20 crc kubenswrapper[4957]: I0123 10:52:20.024148 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:20Z","lastTransitionTime":"2026-01-23T10:52:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:20 crc kubenswrapper[4957]: I0123 10:52:20.127489 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:20 crc kubenswrapper[4957]: I0123 10:52:20.127563 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:20 crc kubenswrapper[4957]: I0123 10:52:20.127584 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:20 crc kubenswrapper[4957]: I0123 10:52:20.127608 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:20 crc kubenswrapper[4957]: I0123 10:52:20.127625 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:20Z","lastTransitionTime":"2026-01-23T10:52:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:20 crc kubenswrapper[4957]: I0123 10:52:20.229757 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:20 crc kubenswrapper[4957]: I0123 10:52:20.229805 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:20 crc kubenswrapper[4957]: I0123 10:52:20.229815 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:20 crc kubenswrapper[4957]: I0123 10:52:20.229833 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:20 crc kubenswrapper[4957]: I0123 10:52:20.229843 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:20Z","lastTransitionTime":"2026-01-23T10:52:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:20 crc kubenswrapper[4957]: I0123 10:52:20.333402 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:20 crc kubenswrapper[4957]: I0123 10:52:20.333477 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:20 crc kubenswrapper[4957]: I0123 10:52:20.333499 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:20 crc kubenswrapper[4957]: I0123 10:52:20.333529 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:20 crc kubenswrapper[4957]: I0123 10:52:20.333550 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:20Z","lastTransitionTime":"2026-01-23T10:52:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:20 crc kubenswrapper[4957]: I0123 10:52:20.352597 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/87775b38-0664-48f6-8857-7568c135bd79-metrics-certs\") pod \"network-metrics-daemon-5fxfb\" (UID: \"87775b38-0664-48f6-8857-7568c135bd79\") " pod="openshift-multus/network-metrics-daemon-5fxfb" Jan 23 10:52:20 crc kubenswrapper[4957]: E0123 10:52:20.352797 4957 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 23 10:52:20 crc kubenswrapper[4957]: E0123 10:52:20.352877 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/87775b38-0664-48f6-8857-7568c135bd79-metrics-certs podName:87775b38-0664-48f6-8857-7568c135bd79 nodeName:}" failed. No retries permitted until 2026-01-23 10:52:36.352853014 +0000 UTC m=+65.890105741 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/87775b38-0664-48f6-8857-7568c135bd79-metrics-certs") pod "network-metrics-daemon-5fxfb" (UID: "87775b38-0664-48f6-8857-7568c135bd79") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 23 10:52:20 crc kubenswrapper[4957]: I0123 10:52:20.436568 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:20 crc kubenswrapper[4957]: I0123 10:52:20.436622 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:20 crc kubenswrapper[4957]: I0123 10:52:20.436639 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:20 crc kubenswrapper[4957]: I0123 10:52:20.436662 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:20 crc kubenswrapper[4957]: I0123 10:52:20.436678 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:20Z","lastTransitionTime":"2026-01-23T10:52:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:20 crc kubenswrapper[4957]: I0123 10:52:20.540191 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:20 crc kubenswrapper[4957]: I0123 10:52:20.540259 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:20 crc kubenswrapper[4957]: I0123 10:52:20.540320 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:20 crc kubenswrapper[4957]: I0123 10:52:20.540355 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:20 crc kubenswrapper[4957]: I0123 10:52:20.540375 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:20Z","lastTransitionTime":"2026-01-23T10:52:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:20 crc kubenswrapper[4957]: I0123 10:52:20.644138 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:20 crc kubenswrapper[4957]: I0123 10:52:20.644201 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:20 crc kubenswrapper[4957]: I0123 10:52:20.644223 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:20 crc kubenswrapper[4957]: I0123 10:52:20.644253 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:20 crc kubenswrapper[4957]: I0123 10:52:20.644373 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:20Z","lastTransitionTime":"2026-01-23T10:52:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:20 crc kubenswrapper[4957]: I0123 10:52:20.744150 4957 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 23:34:23.172410787 +0000 UTC Jan 23 10:52:20 crc kubenswrapper[4957]: I0123 10:52:20.747753 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:20 crc kubenswrapper[4957]: I0123 10:52:20.747834 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:20 crc kubenswrapper[4957]: I0123 10:52:20.747861 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:20 crc kubenswrapper[4957]: I0123 10:52:20.747899 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:20 crc kubenswrapper[4957]: I0123 10:52:20.747925 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:20Z","lastTransitionTime":"2026-01-23T10:52:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:20 crc kubenswrapper[4957]: I0123 10:52:20.758354 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 23 10:52:20 crc kubenswrapper[4957]: E0123 10:52:20.758527 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-23 10:52:52.758497358 +0000 UTC m=+82.295750085 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 23 10:52:20 crc kubenswrapper[4957]: I0123 10:52:20.758638 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 23 10:52:20 crc kubenswrapper[4957]: I0123 10:52:20.758697 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 23 10:52:20 crc kubenswrapper[4957]: I0123 10:52:20.758736 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 23 10:52:20 crc kubenswrapper[4957]: I0123 10:52:20.758785 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 23 10:52:20 crc kubenswrapper[4957]: E0123 10:52:20.758906 4957 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 23 10:52:20 crc kubenswrapper[4957]: E0123 10:52:20.758947 4957 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 23 10:52:20 crc kubenswrapper[4957]: E0123 10:52:20.758987 4957 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 23 10:52:20 crc kubenswrapper[4957]: E0123 10:52:20.758993 4957 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 23 10:52:20 crc kubenswrapper[4957]: E0123 10:52:20.759013 4957 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 23 
10:52:20 crc kubenswrapper[4957]: E0123 10:52:20.759023 4957 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 23 10:52:20 crc kubenswrapper[4957]: E0123 10:52:20.759055 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-23 10:52:52.759022743 +0000 UTC m=+82.296275470 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 23 10:52:20 crc kubenswrapper[4957]: E0123 10:52:20.759063 4957 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 23 10:52:20 crc kubenswrapper[4957]: E0123 10:52:20.759092 4957 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 23 10:52:20 crc kubenswrapper[4957]: E0123 10:52:20.759100 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-23 10:52:52.759076754 +0000 UTC m=+82.296329481 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 23 10:52:20 crc kubenswrapper[4957]: E0123 10:52:20.759137 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-23 10:52:52.759118185 +0000 UTC m=+82.296370912 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 23 10:52:20 crc kubenswrapper[4957]: E0123 10:52:20.759171 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-23 10:52:52.759150576 +0000 UTC m=+82.296403303 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 23 10:52:20 crc kubenswrapper[4957]: I0123 10:52:20.769466 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 23 10:52:20 crc kubenswrapper[4957]: I0123 10:52:20.769591 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 23 10:52:20 crc kubenswrapper[4957]: I0123 10:52:20.769778 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5fxfb" Jan 23 10:52:20 crc kubenswrapper[4957]: I0123 10:52:20.769831 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 23 10:52:20 crc kubenswrapper[4957]: E0123 10:52:20.770008 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5fxfb" podUID="87775b38-0664-48f6-8857-7568c135bd79" Jan 23 10:52:20 crc kubenswrapper[4957]: E0123 10:52:20.770136 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 23 10:52:20 crc kubenswrapper[4957]: E0123 10:52:20.770940 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 23 10:52:20 crc kubenswrapper[4957]: E0123 10:52:20.769794 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 23 10:52:20 crc kubenswrapper[4957]: I0123 10:52:20.771591 4957 scope.go:117] "RemoveContainer" containerID="d660c0e57541cf895c30cdd702c50a43e189e0ed85fe800f422cc1646cbb57e5" Jan 23 10:52:20 crc kubenswrapper[4957]: I0123 10:52:20.792736 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://512cd439903792d034cd6017d149d8f3e9e24ffbfc36964572fc9419d54c3513\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:20Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:20 crc kubenswrapper[4957]: I0123 10:52:20.814645 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:20Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:20 crc kubenswrapper[4957]: I0123 10:52:20.833125 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb53662e-fe72-4c19-b3a6-f5b541e5afcb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://829cfdb541d2a7861316957f39b8b9f43ec6f9f4e309a491f4451b1f3c34a9a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d9f270c80ebedc7d79510e2f421e23789483dce954f5e1469469703660febf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90926087d1bb350c991fa9425706fcc22e12eec003aba87b72758892aae9d3a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8144556693b41dc2f9121be49ceed161caf8db5eec797f086128a2016be8072\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:20Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:20 crc kubenswrapper[4957]: I0123 10:52:20.844758 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rg9hb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a6ddd9-627a-4faa-a4c4-096ea19af31d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f469de9a3c43ade33ae855757f1244dcd825827dea9633af7143c078b08d6d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wngq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase
\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rg9hb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:20Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:20 crc kubenswrapper[4957]: I0123 10:52:20.851074 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:20 crc kubenswrapper[4957]: I0123 10:52:20.851128 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:20 crc kubenswrapper[4957]: I0123 10:52:20.851146 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:20 crc kubenswrapper[4957]: I0123 10:52:20.851168 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:20 crc kubenswrapper[4957]: I0123 10:52:20.851184 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:20Z","lastTransitionTime":"2026-01-23T10:52:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:20 crc kubenswrapper[4957]: I0123 10:52:20.858429 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2xjv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"224e3211-1f68-4673-8975-7e71b1e513d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dd046581d049e9ca0071a010da143a9b28d271b533b9cdc1c94d19311be0320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s26mf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f355e8990ff693448c7b8df392b7b2caeb59d6fee6cf8d5d4200f8ce1b5e03ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s26mf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w2xjv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:20Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:20 crc kubenswrapper[4957]: I0123 10:52:20.873168 4957 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-tlz2g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"233fdd78-4010-4fe8-9068-ee47d8ff25d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6851e0ec1040550b8c9edb1b85213d2c849e381fae6b0f09c9a7247bd9c5088\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpwrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-tlz2g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:20Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:20 crc kubenswrapper[4957]: I0123 10:52:20.896602 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87adc28a-89e3-4743-a9f2-098d4a9432d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a14cf2687aa7c7a4c43dffbc2ad99a41aef0e46719171f63c7f769ee2d54e4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca18567eec1b0cc34d911b28d9f3d670a061722086817f58236f6a0da557262\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cer
t\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f862c6f11fe904458a8ecde92079c0b4aa4a9cb4dfc6f2ca094a1d3142570d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0efa75cf10a812bc4de5b071048558eb5f48828f6fb3049f3820fe5e0b7e2b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1a90ac89ce8ac710e5f8cff26e69aff44a735ef8155a7e93324809904a33e01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26b4bdc4f2514902dc2c95df59af4c954a5c1905821f5981e9437ff54d6d544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d660c0e57541cf895c30cdd702c50a43e189e0ed85fe800f422cc1646cbb57e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d660c0e57541cf895c30cdd702c50a43e189e0ed85fe800f422cc1646cbb57e5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-23T10:52:02Z\\\",\\\"message\\\":\\\"ft/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0123 10:52:02.002619 6388 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0123 10:52:02.002696 6388 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0123 10:52:02.003246 6388 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0123 10:52:02.003275 6388 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0123 10:52:02.003331 6388 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0123 10:52:02.003358 6388 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0123 10:52:02.003364 6388 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0123 10:52:02.003360 6388 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0123 10:52:02.003378 6388 factory.go:656] Stopping watch factory\\\\nI0123 10:52:02.003385 6388 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0123 10:52:02.003395 6388 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0123 10:52:02.003406 6388 handler.go:208] Removed *v1.Namespace 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-23T10:52:01Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-z8hcw_openshift-ovn-kubernetes(87adc28a-89e3-4743-a9f2-098d4a9432d8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1be5b459e3fae28da165ef0ee506ec5ccd39026d7b7e7c35a3f242c65d60d0ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveR
eadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3de3a4f166173fb3f0ff7428b1d1ddf43a9988749fc445ecff4a7527bd1f3a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3de3a4f166173fb3f0ff7428b1d1ddf43a9988749fc445ecff4a7527bd1f3a4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z8hcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:20Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:20 crc kubenswrapper[4957]: I0123 10:52:20.909416 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9rkq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"340bb9e5-0a20-4377-acf7-aba4b7788153\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac26363f220465a8681578f7a7b90cf7d0abf8676379a1e963f4998646327c95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:52:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55l78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19e8778b10a2bb2a713374cb69e07a76edf2371f7a130a691e759a03c0322251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:52:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55l78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:52:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r9rkq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:20Z is after 2025-08-24T17:21:41Z" Jan 23 
10:52:20 crc kubenswrapper[4957]: I0123 10:52:20.930850 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8631604-63ce-40b0-b27e-fba17f940f20\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://523b9a208f414955faffe254957d3bb6d287eab26ea653e23c9bcc2c3182d5cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80645d17b02b24a907b20d376fcb65a794768d4c9cf07550bff63d50a011836d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3fcfc9fcf5f37f32b4a654710f3f0f5c3fab5b0b5c35239e5f1a2789d1ec480\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"lo
g-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b48e3593322a778bf8d56e3509d97a341f1fee5e172f8ba4bbc4c1dacefb3930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://193091ca5d5fb974b1e2da289e7fbc6e2d3a292e79d1936c7ba10266a5ba9779\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7feaf0932d8842db7af2f28984a811c16674b50ba28a4627829ce8471543daa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7feaf0932d8842db7af2f28984a811c16674b50ba28a4627829ce8471543daa9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://370f9a1676bfe308ad9883454af09a802df678db77e0e246de3cc7aea91410a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://370f9a1676bfe308ad9883454af09a802df678db77e0e246de3cc7aea91410a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"reas
on\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://78b38484db8444b196494249f32bd63f2478d80e89ff823c8ac7c974b1ea6e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78b38484db8444b196494249f32bd63f2478d80e89ff823c8ac7c974b1ea6e76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:20Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:20 crc kubenswrapper[4957]: I0123 10:52:20.946169 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b07c6571fe0e39bd6607feb900919a481ef8a36483c9b4de1c6d5ea3453ba61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:20Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:20 crc kubenswrapper[4957]: I0123 10:52:20.954654 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:20 crc kubenswrapper[4957]: I0123 10:52:20.954696 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:20 crc kubenswrapper[4957]: I0123 10:52:20.954712 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:20 crc kubenswrapper[4957]: I0123 10:52:20.954731 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:20 crc kubenswrapper[4957]: I0123 10:52:20.954746 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:20Z","lastTransitionTime":"2026-01-23T10:52:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:20 crc kubenswrapper[4957]: I0123 10:52:20.956497 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fnxz6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c7b1449-2e9b-4c07-a531-591cb968f511\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6410b47b5b38b4ce50175e8cd9c2cc7ca241b914d1dba4accf3a1deb3e066ccd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj687\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:49Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fnxz6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:20Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:20 crc kubenswrapper[4957]: I0123 10:52:20.973069 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea507738-b425-4366-808b-3a47317e66d0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cdf71b1a8491d3a4853fde19a5b1af1eb4697cbf07de482e22a52704ba0470f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52eea4c3c7c3b8898e64dd0eb05c1883ea1c2fa94e7e606f3ab48bbf5aaee8d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f405b6b517d30a201b793965bd82536f496d62b89562cefc7e3a9d9f7829633\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b6915f908509c8609290327ffc2dccf0e5680dc227979285a7ebaca4643cb7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e837e02e63dbe59e7920302c0fb0b5c9165e96ebb684adadb02bacd61633214\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"le observer\\\\nW0123 10:51:48.273886 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0123 10:51:48.273997 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0123 10:51:48.275269 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3502356273/tls.crt::/tmp/serving-cert-3502356273/tls.key\\\\\\\"\\\\nI0123 10:51:48.537137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0123 10:51:48.548481 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0123 10:51:48.548515 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0123 10:51:48.548544 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0123 10:51:48.548577 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0123 10:51:48.561057 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0123 10:51:48.561112 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0123 10:51:48.561123 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0123 10:51:48.561139 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0123 10:51:48.561151 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0123 10:51:48.561158 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0123 10:51:48.561167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0123 10:51:48.561209 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0123 10:51:48.561756 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da77583099215643577c5d064d67ce2cca9d0b74e7ba7c88f3a948a8516fd66c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd61e9841e88c0a34764491cfafe3e4f94233e4ac0543a3e6ed1326dd8d7280d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd61e9841e88c0a34764491cfafe3e4f94233e4ac0543a3e6ed1326dd8d7280d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:20Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:20 crc kubenswrapper[4957]: I0123 10:52:20.989072 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:20Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:21 crc kubenswrapper[4957]: I0123 10:52:21.007610 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9f19c40d295c11e3a1170d61fd738b1dcd8fb10087f6a1bb74e6e6c8e6cfb56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6fcda9eaf99f7b60db85da6f18a98ccca7b5bc532aa28388fc7845caf1a7356\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:21Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:21 crc kubenswrapper[4957]: I0123 10:52:21.022613 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:21Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:21 crc kubenswrapper[4957]: I0123 10:52:21.042565 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6cq2v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11d94cd0-1619-4ef6-952a-aef84e1cdc75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdb19fbdef461009ebd78d9089ba9c94908e4c9fbcab108320e0d89c7f30547f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d28aff8ae7d4adbef5753dc21af0f1055f19ed4404bf9af31f34be215120555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d28aff8ae7d4adbef5753dc21af0f1055f19ed4404bf9af31f34be215120555\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90e2094361e033ef7c8226913e9e396e6a2036617edb9c5ad0ed00c2c5f9f707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90e2094361e033ef7c8226913e9e396e6a2036617edb9c5ad0ed00c2c5f9f707\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e945a247fdc5833508edb04c5130a406d137e3b0ab1be12f9bf954f2621c3bf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e945a247fdc5833508edb04c5130a406d137e3b0ab1be12f9bf954f2621c3bf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26ffeff282bb20d68e9f648999f28b818a67b74f062714fd26ad22e70591a74c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26ffeff282bb20d68e9f648999f28b818a67b74f062714fd26ad22e70591a74c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bbe74c2c27b97766cfa5d26987cd0fbd8289871e66eab78c54a52c0aa039728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bbe74c2c27b97766cfa5d26987cd0fbd8289871e66eab78c54a52c0aa039728\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9ee5b55e77324735662dd6bc0fdeee86af454eb4b9e8eb9e877119f7c1395ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9ee5b55e77324735662dd6bc0fdeee86af454eb4b9e8eb9e877119f7c1395ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6cq2v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:21Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:21 crc kubenswrapper[4957]: I0123 10:52:21.057032 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:21 crc kubenswrapper[4957]: I0123 10:52:21.057069 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:21 crc 
kubenswrapper[4957]: I0123 10:52:21.057085 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:21 crc kubenswrapper[4957]: I0123 10:52:21.057103 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:21 crc kubenswrapper[4957]: I0123 10:52:21.057115 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:21Z","lastTransitionTime":"2026-01-23T10:52:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:21 crc kubenswrapper[4957]: I0123 10:52:21.057773 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5fxfb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87775b38-0664-48f6-8857-7568c135bd79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrj7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrj7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:52:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5fxfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:21Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:21 crc kubenswrapper[4957]: I0123 10:52:21.107642 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z8hcw_87adc28a-89e3-4743-a9f2-098d4a9432d8/ovnkube-controller/1.log" Jan 23 10:52:21 crc kubenswrapper[4957]: I0123 10:52:21.111292 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" event={"ID":"87adc28a-89e3-4743-a9f2-098d4a9432d8","Type":"ContainerStarted","Data":"1b61126eca35a8279d32f7b9386c382f26da91f6b28007d881463f438155d2e3"} Jan 23 10:52:21 crc kubenswrapper[4957]: I0123 10:52:21.111805 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" Jan 23 10:52:21 crc kubenswrapper[4957]: I0123 10:52:21.124554 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5fxfb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"87775b38-0664-48f6-8857-7568c135bd79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrj7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrj7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:52:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5fxfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:21Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:21 crc kubenswrapper[4957]: I0123 10:52:21.150656 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea507738-b425-4366-808b-3a47317e66d0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cdf71b1a8491d3a4853fde19a5b1af1eb4697cbf07de482e22a52704ba0470f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52eea4c3c7c3b8898e64dd0eb05c1883ea1c2fa94e7e606f3ab48bbf5aaee8d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f405b6b517d30a201b793965bd82536f496d62b89562cefc7e3a9d9f7829633\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b6915f908509c8609290327ffc2dccf0e5680dc227979285a7ebaca4643cb7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e837e02e63dbe59e7920302c0fb0b5c9165e96ebb684adadb02bacd61633214\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"le observer\\\\nW0123 10:51:48.273886 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0123 10:51:48.273997 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0123 10:51:48.275269 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3502356273/tls.crt::/tmp/serving-cert-3502356273/tls.key\\\\\\\"\\\\nI0123 10:51:48.537137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0123 10:51:48.548481 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0123 10:51:48.548515 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0123 10:51:48.548544 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0123 10:51:48.548577 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0123 10:51:48.561057 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0123 10:51:48.561112 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0123 10:51:48.561123 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0123 10:51:48.561139 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0123 10:51:48.561151 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0123 10:51:48.561158 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0123 10:51:48.561167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0123 10:51:48.561209 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0123 10:51:48.561756 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da77583099215643577c5d064d67ce2cca9d0b74e7ba7c88f3a948a8516fd66c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd61e9841e88c0a34764491cfafe3e4f94233e4ac0543a3e6ed1326dd8d7280d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd61e9841e88c0a34764491cfafe3e4f94233e4ac0543a3e6ed1326dd8d7280d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:21Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:21 crc kubenswrapper[4957]: I0123 10:52:21.160920 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:21 crc kubenswrapper[4957]: I0123 10:52:21.160983 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:21 crc kubenswrapper[4957]: I0123 10:52:21.161000 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:21 crc kubenswrapper[4957]: I0123 10:52:21.161025 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:21 crc kubenswrapper[4957]: I0123 10:52:21.161047 4957 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:21Z","lastTransitionTime":"2026-01-23T10:52:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:21 crc kubenswrapper[4957]: I0123 10:52:21.170993 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:21Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:21 crc kubenswrapper[4957]: I0123 10:52:21.190773 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9f19c40d295c11e3a1170d61fd738b1dcd8fb10087f6a1bb74e6e6c8e6cfb56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6fcda9eaf99f7b60db85da6f18a98ccca7b5bc532aa28388fc7845caf1a7356\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:21Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:21 crc kubenswrapper[4957]: I0123 10:52:21.221353 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:21Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:21 crc kubenswrapper[4957]: I0123 10:52:21.239995 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6cq2v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11d94cd0-1619-4ef6-952a-aef84e1cdc75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdb19fbdef461009ebd78d9089ba9c94908e4c9fbcab108320e0d89c7f30547f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d28aff8ae7d4adbef5753dc21af0f1055f19ed4404bf9af31f34be215120555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5
db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d28aff8ae7d4adbef5753dc21af0f1055f19ed4404bf9af31f34be215120555\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90e2094361e033ef7c8226913e9e396e6a2036617edb9c5ad0ed00c2c5f9f707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90e2094361e033ef7c8226913e9e396e6a2036617edb9c5ad0ed00c2c5f9f707\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e945a247fdc5833508edb04c5130a406d137e3b0ab1be12f9bf954f2621c3bf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e945a247fdc5833508edb04c5130a406d137e3b0ab1be12f9bf954f2621c3bf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io
/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26ffeff282bb20d68e9f648999f28b818a67b74f062714fd26ad22e70591a74c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26ffeff282bb20d68e9f648999f28b818a67b74f062714fd26ad22e70591a74c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bbe74c2c27b97766cfa5d26987cd0fbd8289871e66eab78c54a52c0aa039728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bbe74c2c27b97766cfa5d26987cd0fbd8289871e66eab78c54a52c0aa039728\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9ee5b55e77324735662dd6bc0fdeee86af454eb4b9e8eb9e877119f7c1395ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9ee5b55e77324735662dd6bc0fdeee86af454eb4b9e8eb9e877119f7c1395ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mo
untPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6cq2v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:21Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:21 crc kubenswrapper[4957]: I0123 10:52:21.263536 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://512cd439903792d034cd6017d149d8f3e9e24ffbfc36964572fc9419d54c3513\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:21Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:21 crc kubenswrapper[4957]: I0123 10:52:21.265118 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:21 crc kubenswrapper[4957]: I0123 10:52:21.265174 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 23 10:52:21 crc kubenswrapper[4957]: I0123 10:52:21.265199 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:21 crc kubenswrapper[4957]: I0123 10:52:21.265227 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:21 crc kubenswrapper[4957]: I0123 10:52:21.265251 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:21Z","lastTransitionTime":"2026-01-23T10:52:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:21 crc kubenswrapper[4957]: I0123 10:52:21.279958 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:21Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:21 crc kubenswrapper[4957]: I0123 10:52:21.302331 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb53662e-fe72-4c19-b3a6-f5b541e5afcb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://829cfdb541d2a7861316957f39b8b9f43ec6f9f4e309a491f4451b1f3c34a9a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d9f270c80ebedc7d79510e2f421e23789483dce954f5e1469469703660febf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"r
esource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90926087d1bb350c991fa9425706fcc22e12eec003aba87b72758892aae9d3a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8144556693b41dc2f9121be49ceed161caf8db5eec797f086128a2016be8072\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:21Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:21 crc kubenswrapper[4957]: I0123 10:52:21.314099 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rg9hb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a6ddd9-627a-4faa-a4c4-096ea19af31d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f469de9a3c43ade33ae855757f1244dcd825827dea9633af7143c078b08d6d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wngq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rg9hb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:21Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:21 crc kubenswrapper[4957]: I0123 10:52:21.324264 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2xjv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"224e3211-1f68-4673-8975-7e71b1e513d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dd046581d049e9ca0071a010da143a9b28d271b533b9cdc1c94d19311be0320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s26mf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f355e8990ff693448c7b8df392b7b2caeb59d6fee6cf8d5d4200f8ce1b5e03ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s26mf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w2xjv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:21Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:21 crc kubenswrapper[4957]: I0123 10:52:21.338084 4957 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-tlz2g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"233fdd78-4010-4fe8-9068-ee47d8ff25d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6851e0ec1040550b8c9edb1b85213d2c849e381fae6b0f09c9a7247bd9c5088\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpwrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-tlz2g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:21Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:21 crc kubenswrapper[4957]: I0123 10:52:21.354524 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87adc28a-89e3-4743-a9f2-098d4a9432d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a14cf2687aa7c7a4c43dffbc2ad99a41aef0e46719171f63c7f769ee2d54e4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca18567eec1b0cc34d911b28d9f3d670a061722086817f58236f6a0da557262\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cer
t\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f862c6f11fe904458a8ecde92079c0b4aa4a9cb4dfc6f2ca094a1d3142570d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0efa75cf10a812bc4de5b071048558eb5f48828f6fb3049f3820fe5e0b7e2b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1a90ac89ce8ac710e5f8cff26e69aff44a735ef8155a7e93324809904a33e01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26b4bdc4f2514902dc2c95df59af4c954a5c1905821f5981e9437ff54d6d544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b61126eca35a8279d32f7b9386c382f26da91f6b28007d881463f438155d2e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d660c0e57541cf895c30cdd702c50a43e189e0ed85fe800f422cc1646cbb57e5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-23T10:52:02Z\\\",\\\"message\\\":\\\"ft/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0123 10:52:02.002619 6388 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0123 10:52:02.002696 6388 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0123 10:52:02.003246 6388 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0123 10:52:02.003275 6388 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0123 10:52:02.003331 6388 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0123 10:52:02.003358 6388 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0123 10:52:02.003364 6388 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0123 10:52:02.003360 6388 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0123 10:52:02.003378 6388 factory.go:656] Stopping watch factory\\\\nI0123 10:52:02.003385 6388 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0123 10:52:02.003395 6388 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0123 10:52:02.003406 6388 handler.go:208] Removed *v1.Namespace 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-23T10:52:01Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:52:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1be5b459e3fae28da165ef0ee506ec5ccd39026d7b7e7c35a3f242c65d60d0ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"con
tainerID\\\":\\\"cri-o://3de3a4f166173fb3f0ff7428b1d1ddf43a9988749fc445ecff4a7527bd1f3a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3de3a4f166173fb3f0ff7428b1d1ddf43a9988749fc445ecff4a7527bd1f3a4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z8hcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:21Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:21 crc kubenswrapper[4957]: I0123 10:52:21.366963 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9rkq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"340bb9e5-0a20-4377-acf7-aba4b7788153\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac26363f220465a8681578f7a7b90cf7d0abf8676379a1e963f4998646327c95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:52:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55l78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19e8778b10a2bb2a713374cb69e07a76edf2371f7a130a691e759a03c0322251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:52:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55l78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:52:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r9rkq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:21Z is after 2025-08-24T17:21:41Z" Jan 23 
10:52:21 crc kubenswrapper[4957]: I0123 10:52:21.367789 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:21 crc kubenswrapper[4957]: I0123 10:52:21.367810 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:21 crc kubenswrapper[4957]: I0123 10:52:21.367820 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:21 crc kubenswrapper[4957]: I0123 10:52:21.367836 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:21 crc kubenswrapper[4957]: I0123 10:52:21.367848 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:21Z","lastTransitionTime":"2026-01-23T10:52:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:21 crc kubenswrapper[4957]: I0123 10:52:21.387675 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8631604-63ce-40b0-b27e-fba17f940f20\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://523b9a208f414955faffe254957d3bb6d287eab26ea653e23c9bcc2c3182d5cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80645d17b02b24a907b20d376fcb65a794768d4c9cf07550bff63d50a011836d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b9009
2272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3fcfc9fcf5f37f32b4a654710f3f0f5c3fab5b0b5c35239e5f1a2789d1ec480\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b48e3593322a778bf8d56e3509d97a341f1fee5e172f8ba4bbc4c1dacefb3930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://193091ca5d5fb974b1e2da289e7fbc6e2d3a292e79d1936c7ba10266a5ba9779\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7feaf0932d8842db7af2f28984a811c16674b50ba28a4627829ce8471543daa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\
":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7feaf0932d8842db7af2f28984a811c16674b50ba28a4627829ce8471543daa9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://370f9a1676bfe308ad9883454af09a802df678db77e0e246de3cc7aea91410a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://370f9a1676bfe308ad9883454af09a802df678db77e0e246de3cc7aea91410a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://78b38484db8444b196494249f32bd63f2478d80e89ff823c8ac7c974b1ea6e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78b38484db8444b196494249f32bd63f2478d80e89ff823c8ac7c974b1ea6e76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:21Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:21 crc kubenswrapper[4957]: I0123 10:52:21.398834 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b07c6571fe0e39bd6607feb900919a481ef8a36483c9b4de1c6d5ea3453ba61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:21Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:21 crc kubenswrapper[4957]: I0123 10:52:21.408694 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fnxz6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c7b1449-2e9b-4c07-a531-591cb968f511\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6410b47b5b38b4ce50175e8cd9c2cc7ca241b914d1dba4accf3a1deb3e066ccd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj687\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:49Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fnxz6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:21Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:21 crc kubenswrapper[4957]: I0123 10:52:21.469883 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:21 crc kubenswrapper[4957]: I0123 10:52:21.469919 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:21 crc kubenswrapper[4957]: I0123 10:52:21.469928 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:21 crc kubenswrapper[4957]: I0123 10:52:21.469943 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:21 crc kubenswrapper[4957]: I0123 10:52:21.469952 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:21Z","lastTransitionTime":"2026-01-23T10:52:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:21 crc kubenswrapper[4957]: I0123 10:52:21.571639 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:21 crc kubenswrapper[4957]: I0123 10:52:21.571706 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:21 crc kubenswrapper[4957]: I0123 10:52:21.571715 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:21 crc kubenswrapper[4957]: I0123 10:52:21.571728 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:21 crc kubenswrapper[4957]: I0123 10:52:21.571736 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:21Z","lastTransitionTime":"2026-01-23T10:52:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:21 crc kubenswrapper[4957]: I0123 10:52:21.674054 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:21 crc kubenswrapper[4957]: I0123 10:52:21.674099 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:21 crc kubenswrapper[4957]: I0123 10:52:21.674110 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:21 crc kubenswrapper[4957]: I0123 10:52:21.674127 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:21 crc kubenswrapper[4957]: I0123 10:52:21.674141 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:21Z","lastTransitionTime":"2026-01-23T10:52:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:21 crc kubenswrapper[4957]: I0123 10:52:21.744505 4957 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-15 15:22:30.0806462 +0000 UTC Jan 23 10:52:21 crc kubenswrapper[4957]: I0123 10:52:21.777013 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:21 crc kubenswrapper[4957]: I0123 10:52:21.777073 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:21 crc kubenswrapper[4957]: I0123 10:52:21.777095 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:21 crc kubenswrapper[4957]: I0123 10:52:21.777123 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:21 crc kubenswrapper[4957]: I0123 10:52:21.777145 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:21Z","lastTransitionTime":"2026-01-23T10:52:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:21 crc kubenswrapper[4957]: I0123 10:52:21.882103 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:21 crc kubenswrapper[4957]: I0123 10:52:21.882159 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:21 crc kubenswrapper[4957]: I0123 10:52:21.882183 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:21 crc kubenswrapper[4957]: I0123 10:52:21.882210 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:21 crc kubenswrapper[4957]: I0123 10:52:21.882230 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:21Z","lastTransitionTime":"2026-01-23T10:52:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:21 crc kubenswrapper[4957]: I0123 10:52:21.984765 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:21 crc kubenswrapper[4957]: I0123 10:52:21.984843 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:21 crc kubenswrapper[4957]: I0123 10:52:21.984861 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:21 crc kubenswrapper[4957]: I0123 10:52:21.984886 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:21 crc kubenswrapper[4957]: I0123 10:52:21.984905 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:21Z","lastTransitionTime":"2026-01-23T10:52:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:22 crc kubenswrapper[4957]: I0123 10:52:22.088196 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:22 crc kubenswrapper[4957]: I0123 10:52:22.088271 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:22 crc kubenswrapper[4957]: I0123 10:52:22.088325 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:22 crc kubenswrapper[4957]: I0123 10:52:22.088353 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:22 crc kubenswrapper[4957]: I0123 10:52:22.088373 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:22Z","lastTransitionTime":"2026-01-23T10:52:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:22 crc kubenswrapper[4957]: I0123 10:52:22.124119 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z8hcw_87adc28a-89e3-4743-a9f2-098d4a9432d8/ovnkube-controller/2.log" Jan 23 10:52:22 crc kubenswrapper[4957]: I0123 10:52:22.125346 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z8hcw_87adc28a-89e3-4743-a9f2-098d4a9432d8/ovnkube-controller/1.log" Jan 23 10:52:22 crc kubenswrapper[4957]: I0123 10:52:22.130125 4957 generic.go:334] "Generic (PLEG): container finished" podID="87adc28a-89e3-4743-a9f2-098d4a9432d8" containerID="1b61126eca35a8279d32f7b9386c382f26da91f6b28007d881463f438155d2e3" exitCode=1 Jan 23 10:52:22 crc kubenswrapper[4957]: I0123 10:52:22.130208 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" event={"ID":"87adc28a-89e3-4743-a9f2-098d4a9432d8","Type":"ContainerDied","Data":"1b61126eca35a8279d32f7b9386c382f26da91f6b28007d881463f438155d2e3"} Jan 23 10:52:22 crc kubenswrapper[4957]: I0123 10:52:22.130446 4957 scope.go:117] "RemoveContainer" containerID="d660c0e57541cf895c30cdd702c50a43e189e0ed85fe800f422cc1646cbb57e5" Jan 23 10:52:22 crc kubenswrapper[4957]: I0123 10:52:22.131243 4957 scope.go:117] "RemoveContainer" containerID="1b61126eca35a8279d32f7b9386c382f26da91f6b28007d881463f438155d2e3" Jan 23 10:52:22 crc kubenswrapper[4957]: E0123 10:52:22.131555 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-z8hcw_openshift-ovn-kubernetes(87adc28a-89e3-4743-a9f2-098d4a9432d8)\"" pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" podUID="87adc28a-89e3-4743-a9f2-098d4a9432d8" Jan 23 10:52:22 crc kubenswrapper[4957]: I0123 10:52:22.155745 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://512cd439903792d034cd6017d149d8f3e9e24ffbfc36964572fc9419d54c3513\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:22Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:22 crc kubenswrapper[4957]: I0123 10:52:22.179442 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:22Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:22 crc kubenswrapper[4957]: I0123 10:52:22.190922 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:22 crc kubenswrapper[4957]: I0123 10:52:22.190976 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:22 crc kubenswrapper[4957]: I0123 10:52:22.190990 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:22 crc kubenswrapper[4957]: I0123 10:52:22.191010 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:22 crc kubenswrapper[4957]: I0123 10:52:22.191022 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:22Z","lastTransitionTime":"2026-01-23T10:52:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:22 crc kubenswrapper[4957]: I0123 10:52:22.195568 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb53662e-fe72-4c19-b3a6-f5b541e5afcb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://829cfdb541d2a7861316957f39b8b9f43ec6f9f4e309a491f4451b1f3c34a9a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d9f270c80ebedc7d79510e2f421e23789483dce954f5e1469469703660febf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90926087d1bb350c991fa9425706fcc22e12eec003aba87b72758892aae9d3a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8144556693b41dc2f9121be49ceed161caf8db5eec797f086128a2016be8072\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:22Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:22 crc kubenswrapper[4957]: I0123 10:52:22.211212 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rg9hb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a6ddd9-627a-4faa-a4c4-096ea19af31d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f469de9a3c43ade33ae855757f1244dcd825827dea9633af7143c078b08d6d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wngq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rg9hb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:22Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:22 crc kubenswrapper[4957]: I0123 10:52:22.224734 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2xjv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"224e3211-1f68-4673-8975-7e71b1e513d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dd046581d049e9ca0071a010da143a9b28d271b533b9cdc1c94d19311be0320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s26mf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f355e8990ff693448c7b8df392b7b2caeb59d6fee6cf8d5d4200f8ce1b5e03ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s26mf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w2xjv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:22Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:22 crc kubenswrapper[4957]: I0123 10:52:22.246126 4957 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-tlz2g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"233fdd78-4010-4fe8-9068-ee47d8ff25d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6851e0ec1040550b8c9edb1b85213d2c849e381fae6b0f09c9a7247bd9c5088\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpwrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-tlz2g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:22Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:22 crc kubenswrapper[4957]: I0123 10:52:22.278249 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87adc28a-89e3-4743-a9f2-098d4a9432d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a14cf2687aa7c7a4c43dffbc2ad99a41aef0e46719171f63c7f769ee2d54e4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca18567eec1b0cc34d911b28d9f3d670a061722086817f58236f6a0da557262\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cer
t\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f862c6f11fe904458a8ecde92079c0b4aa4a9cb4dfc6f2ca094a1d3142570d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0efa75cf10a812bc4de5b071048558eb5f48828f6fb3049f3820fe5e0b7e2b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1a90ac89ce8ac710e5f8cff26e69aff44a735ef8155a7e93324809904a33e01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26b4bdc4f2514902dc2c95df59af4c954a5c1905821f5981e9437ff54d6d544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b61126eca35a8279d32f7b9386c382f26da91f6b28007d881463f438155d2e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d660c0e57541cf895c30cdd702c50a43e189e0ed85fe800f422cc1646cbb57e5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-23T10:52:02Z\\\",\\\"message\\\":\\\"ft/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0123 10:52:02.002619 6388 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0123 10:52:02.002696 6388 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0123 10:52:02.003246 6388 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0123 10:52:02.003275 6388 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0123 10:52:02.003331 6388 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0123 10:52:02.003358 6388 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0123 10:52:02.003364 6388 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0123 10:52:02.003360 6388 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0123 10:52:02.003378 6388 factory.go:656] Stopping watch factory\\\\nI0123 10:52:02.003385 6388 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0123 10:52:02.003395 6388 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0123 10:52:02.003406 6388 handler.go:208] Removed *v1.Namespace 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-23T10:52:01Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b61126eca35a8279d32f7b9386c382f26da91f6b28007d881463f438155d2e3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-23T10:52:21Z\\\",\\\"message\\\":\\\"e configs for network=default: []services.lbConfig(nil)\\\\nI0123 10:52:21.654230 6620 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-service-ca-operator/metrics]} name:Service_openshift-service-ca-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.40:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {2a3fb1a3-a476-4e14-bcf5-fb79af60206a}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0123 10:52:21.654121 6620 ovnkube_controller.go:900] Cache entry expected pod with UID \\\\\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\\\\\" but failed to find it\\\\nF0123 10:52:21.652450 6620 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network 
controlle\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-23T10:52:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1be5b459e3fae28da165ef0ee506ec5ccd39026d7b7e7c35a3f242c65d60d0ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3de3a4f166173fb3f0ff7428b1d1ddf43a9988749fc445ecff4a7527bd1f3a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47e
f0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3de3a4f166173fb3f0ff7428b1d1ddf43a9988749fc445ecff4a7527bd1f3a4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z8hcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:22Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:22 crc kubenswrapper[4957]: I0123 10:52:22.293968 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:22 crc kubenswrapper[4957]: I0123 10:52:22.294018 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:22 crc kubenswrapper[4957]: I0123 10:52:22.294030 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:22 crc kubenswrapper[4957]: I0123 10:52:22.294048 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:22 crc kubenswrapper[4957]: I0123 10:52:22.294059 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:22Z","lastTransitionTime":"2026-01-23T10:52:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:22 crc kubenswrapper[4957]: I0123 10:52:22.296255 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9rkq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"340bb9e5-0a20-4377-acf7-aba4b7788153\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac26363f220465a8681578f7a7b90cf7d0abf8676379a1e963f4998646327c95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:52:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55l78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19e8778b10a2bb2a713374cb69e07a76edf2371f7a130a691e759a03c0322251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:52:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55l78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:52:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r9rkq\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:22Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:22 crc kubenswrapper[4957]: I0123 10:52:22.324881 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8631604-63ce-40b0-b27e-fba17f940f20\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://523b9a208f414955faffe254957d3bb6d287eab26ea653e23c9bcc2c3182d5cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80645d17b02b24a907b20d376fcb65a794768d4c9cf07550bff63d50a011836d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3fcfc9fcf5f37f32b4a654710f3f0f5c3fab5b0b5c35239e5f1a2789d1ec480\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b48e3593322a778bf8d56e3509d97a341f1fee5e172f8ba4bbc4c1dacefb3930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://193091ca5d5fb974b1e2da289e7fbc6e2d3a292e79d1936c7ba10266a5ba9779\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7feaf0932d8842db7af2f28984a811c16674b50ba28a4627829ce8471543daa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7feaf0932d8842db7af2f28984a811c16674b50ba28a4627829ce8471543daa9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://370f9a1676bfe308ad9883454af09a802df678db77e0e246de3cc7aea91410a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",
\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://370f9a1676bfe308ad9883454af09a802df678db77e0e246de3cc7aea91410a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://78b38484db8444b196494249f32bd63f2478d80e89ff823c8ac7c974b1ea6e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78b38484db8444b196494249f32bd63f2478d80e89ff823c8ac7c974b1ea6e76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:22Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:22 crc kubenswrapper[4957]: I0123 10:52:22.336808 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b07c6571fe0e39bd6607feb900919a481ef8a36483c9b4de1c6d5ea3453ba61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:22Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:22 crc kubenswrapper[4957]: I0123 10:52:22.347379 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fnxz6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c7b1449-2e9b-4c07-a531-591cb968f511\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6410b47b5b38b4ce50175e8cd9c2cc7ca241b914d1dba4accf3a1deb3e066ccd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj687\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:49Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fnxz6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:22Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:22 crc kubenswrapper[4957]: I0123 10:52:22.360586 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea507738-b425-4366-808b-3a47317e66d0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cdf71b1a8491d3a4853fde19a5b1af1eb4697cbf07de482e22a52704ba0470f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52eea4c3c7c3b8898e64dd0eb05c1883ea1c2fa94e7e606f3ab48bbf5aaee8d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f405b6b517d30a201b793965bd82536f496d62b89562cefc7e3a9d9f7829633\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b6915f908509c8609290327ffc2dccf0e5680dc227979285a7ebaca4643cb7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e837e02e63dbe59e7920302c0fb0b5c9165e96ebb684adadb02bacd61633214\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"le observer\\\\nW0123 10:51:48.273886 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0123 10:51:48.273997 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0123 10:51:48.275269 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3502356273/tls.crt::/tmp/serving-cert-3502356273/tls.key\\\\\\\"\\\\nI0123 10:51:48.537137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0123 10:51:48.548481 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0123 10:51:48.548515 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0123 10:51:48.548544 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0123 10:51:48.548577 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0123 10:51:48.561057 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0123 10:51:48.561112 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0123 10:51:48.561123 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0123 10:51:48.561139 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0123 10:51:48.561151 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0123 10:51:48.561158 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0123 10:51:48.561167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0123 10:51:48.561209 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0123 10:51:48.561756 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da77583099215643577c5d064d67ce2cca9d0b74e7ba7c88f3a948a8516fd66c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd61e9841e88c0a34764491cfafe3e4f94233e4ac0543a3e6ed1326dd8d7280d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd61e9841e88c0a34764491cfafe3e4f94233e4ac0543a3e6ed1326dd8d7280d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:22Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:22 crc kubenswrapper[4957]: I0123 10:52:22.373662 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:22Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:22 crc kubenswrapper[4957]: I0123 10:52:22.388848 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9f19c40d295c11e3a1170d61fd738b1dcd8fb10087f6a1bb74e6e6c8e6cfb56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6fcda9eaf99f7b60db85da6f18a98ccca7b5bc532aa28388fc7845caf1a7356\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:22Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:22 crc kubenswrapper[4957]: I0123 10:52:22.396236 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:22 crc kubenswrapper[4957]: I0123 10:52:22.396301 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:22 crc kubenswrapper[4957]: I0123 10:52:22.396311 4957 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Jan 23 10:52:22 crc kubenswrapper[4957]: I0123 10:52:22.396328 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:22 crc kubenswrapper[4957]: I0123 10:52:22.396339 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:22Z","lastTransitionTime":"2026-01-23T10:52:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:22 crc kubenswrapper[4957]: I0123 10:52:22.405115 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:22Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:22 crc kubenswrapper[4957]: I0123 10:52:22.428436 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6cq2v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11d94cd0-1619-4ef6-952a-aef84e1cdc75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdb19fbdef461009ebd78d9089ba9c94908e4c9fbcab108320e0d89c7f30547f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d28aff8ae7d4adbef5753dc21af0f1055f19ed4404bf9af31f34be215120555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5
db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d28aff8ae7d4adbef5753dc21af0f1055f19ed4404bf9af31f34be215120555\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90e2094361e033ef7c8226913e9e396e6a2036617edb9c5ad0ed00c2c5f9f707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90e2094361e033ef7c8226913e9e396e6a2036617edb9c5ad0ed00c2c5f9f707\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e945a247fdc5833508edb04c5130a406d137e3b0ab1be12f9bf954f2621c3bf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e945a247fdc5833508edb04c5130a406d137e3b0ab1be12f9bf954f2621c3bf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io
/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26ffeff282bb20d68e9f648999f28b818a67b74f062714fd26ad22e70591a74c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26ffeff282bb20d68e9f648999f28b818a67b74f062714fd26ad22e70591a74c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bbe74c2c27b97766cfa5d26987cd0fbd8289871e66eab78c54a52c0aa039728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bbe74c2c27b97766cfa5d26987cd0fbd8289871e66eab78c54a52c0aa039728\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9ee5b55e77324735662dd6bc0fdeee86af454eb4b9e8eb9e877119f7c1395ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9ee5b55e77324735662dd6bc0fdeee86af454eb4b9e8eb9e877119f7c1395ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mo
untPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6cq2v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:22Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:22 crc kubenswrapper[4957]: I0123 10:52:22.447342 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5fxfb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87775b38-0664-48f6-8857-7568c135bd79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrj7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrj7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:52:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5fxfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:22Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:22 crc kubenswrapper[4957]: I0123 10:52:22.499863 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:22 crc kubenswrapper[4957]: I0123 10:52:22.499951 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:22 crc kubenswrapper[4957]: I0123 10:52:22.499976 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:22 crc kubenswrapper[4957]: I0123 10:52:22.500006 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:22 crc kubenswrapper[4957]: I0123 10:52:22.500028 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:22Z","lastTransitionTime":"2026-01-23T10:52:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:22 crc kubenswrapper[4957]: I0123 10:52:22.602368 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:22 crc kubenswrapper[4957]: I0123 10:52:22.602421 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:22 crc kubenswrapper[4957]: I0123 10:52:22.602436 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:22 crc kubenswrapper[4957]: I0123 10:52:22.602456 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:22 crc kubenswrapper[4957]: I0123 10:52:22.602468 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:22Z","lastTransitionTime":"2026-01-23T10:52:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:22 crc kubenswrapper[4957]: I0123 10:52:22.705446 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:22 crc kubenswrapper[4957]: I0123 10:52:22.705586 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:22 crc kubenswrapper[4957]: I0123 10:52:22.705609 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:22 crc kubenswrapper[4957]: I0123 10:52:22.705638 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:22 crc kubenswrapper[4957]: I0123 10:52:22.705656 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:22Z","lastTransitionTime":"2026-01-23T10:52:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:22 crc kubenswrapper[4957]: I0123 10:52:22.745325 4957 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 17:37:39.076994068 +0000 UTC Jan 23 10:52:22 crc kubenswrapper[4957]: I0123 10:52:22.769783 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5fxfb" Jan 23 10:52:22 crc kubenswrapper[4957]: E0123 10:52:22.769985 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5fxfb" podUID="87775b38-0664-48f6-8857-7568c135bd79" Jan 23 10:52:22 crc kubenswrapper[4957]: I0123 10:52:22.770542 4957 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 23 10:52:22 crc kubenswrapper[4957]: I0123 10:52:22.770603 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 23 10:52:22 crc kubenswrapper[4957]: E0123 10:52:22.770687 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 23 10:52:22 crc kubenswrapper[4957]: I0123 10:52:22.770725 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 23 10:52:22 crc kubenswrapper[4957]: E0123 10:52:22.770893 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 23 10:52:22 crc kubenswrapper[4957]: E0123 10:52:22.770978 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 23 10:52:22 crc kubenswrapper[4957]: I0123 10:52:22.802344 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:22 crc kubenswrapper[4957]: I0123 10:52:22.802426 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:22 crc kubenswrapper[4957]: I0123 10:52:22.802453 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:22 crc kubenswrapper[4957]: I0123 10:52:22.802485 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:22 crc kubenswrapper[4957]: I0123 10:52:22.802511 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:22Z","lastTransitionTime":"2026-01-23T10:52:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:22 crc kubenswrapper[4957]: E0123 10:52:22.823457 4957 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"608e000a-3057-4f1e-b4ab-15bf3bfea3b8\\\",\\\"systemUUID\\\":\\\"4219e85c-09d5-42d3-a5cb-7c9fe3da136f\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:22Z is after 
2025-08-24T17:21:41Z" Jan 23 10:52:22 crc kubenswrapper[4957]: I0123 10:52:22.828576 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:22 crc kubenswrapper[4957]: I0123 10:52:22.828609 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:22 crc kubenswrapper[4957]: I0123 10:52:22.828618 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:22 crc kubenswrapper[4957]: I0123 10:52:22.828636 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:22 crc kubenswrapper[4957]: I0123 10:52:22.828644 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:22Z","lastTransitionTime":"2026-01-23T10:52:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:22 crc kubenswrapper[4957]: E0123 10:52:22.841827 4957 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"608e000a-3057-4f1e-b4ab-15bf3bfea3b8\\\",\\\"systemUUID\\\":\\\"4219e85c-09d5-42d3-a5cb-7c9fe3da136f\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:22Z is after 
2025-08-24T17:21:41Z" Jan 23 10:52:22 crc kubenswrapper[4957]: I0123 10:52:22.845821 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:22 crc kubenswrapper[4957]: I0123 10:52:22.845859 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:22 crc kubenswrapper[4957]: I0123 10:52:22.845870 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:22 crc kubenswrapper[4957]: I0123 10:52:22.845886 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:22 crc kubenswrapper[4957]: I0123 10:52:22.845899 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:22Z","lastTransitionTime":"2026-01-23T10:52:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:22 crc kubenswrapper[4957]: E0123 10:52:22.864939 4957 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"608e000a-3057-4f1e-b4ab-15bf3bfea3b8\\\",\\\"systemUUID\\\":\\\"4219e85c-09d5-42d3-a5cb-7c9fe3da136f\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:22Z is after 
2025-08-24T17:21:41Z" Jan 23 10:52:22 crc kubenswrapper[4957]: I0123 10:52:22.868545 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:22 crc kubenswrapper[4957]: I0123 10:52:22.868572 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:22 crc kubenswrapper[4957]: I0123 10:52:22.868581 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:22 crc kubenswrapper[4957]: I0123 10:52:22.868593 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:22 crc kubenswrapper[4957]: I0123 10:52:22.868602 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:22Z","lastTransitionTime":"2026-01-23T10:52:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:22 crc kubenswrapper[4957]: E0123 10:52:22.882688 4957 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"608e000a-3057-4f1e-b4ab-15bf3bfea3b8\\\",\\\"systemUUID\\\":\\\"4219e85c-09d5-42d3-a5cb-7c9fe3da136f\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:22Z is after 
2025-08-24T17:21:41Z" Jan 23 10:52:22 crc kubenswrapper[4957]: I0123 10:52:22.887581 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:22 crc kubenswrapper[4957]: I0123 10:52:22.887657 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:22 crc kubenswrapper[4957]: I0123 10:52:22.887682 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:22 crc kubenswrapper[4957]: I0123 10:52:22.887714 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:22 crc kubenswrapper[4957]: I0123 10:52:22.887737 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:22Z","lastTransitionTime":"2026-01-23T10:52:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:22 crc kubenswrapper[4957]: E0123 10:52:22.902831 4957 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"608e000a-3057-4f1e-b4ab-15bf3bfea3b8\\\",\\\"systemUUID\\\":\\\"4219e85c-09d5-42d3-a5cb-7c9fe3da136f\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:22Z is after 
2025-08-24T17:21:41Z" Jan 23 10:52:22 crc kubenswrapper[4957]: E0123 10:52:22.903056 4957 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 23 10:52:22 crc kubenswrapper[4957]: I0123 10:52:22.904766 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:22 crc kubenswrapper[4957]: I0123 10:52:22.904881 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:22 crc kubenswrapper[4957]: I0123 10:52:22.904914 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:22 crc kubenswrapper[4957]: I0123 10:52:22.904935 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:22 crc kubenswrapper[4957]: I0123 10:52:22.904948 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:22Z","lastTransitionTime":"2026-01-23T10:52:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:23 crc kubenswrapper[4957]: I0123 10:52:23.007888 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:23 crc kubenswrapper[4957]: I0123 10:52:23.007967 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:23 crc kubenswrapper[4957]: I0123 10:52:23.007991 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:23 crc kubenswrapper[4957]: I0123 10:52:23.008020 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:23 crc kubenswrapper[4957]: I0123 10:52:23.008043 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:23Z","lastTransitionTime":"2026-01-23T10:52:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:23 crc kubenswrapper[4957]: I0123 10:52:23.111026 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:23 crc kubenswrapper[4957]: I0123 10:52:23.111173 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:23 crc kubenswrapper[4957]: I0123 10:52:23.111198 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:23 crc kubenswrapper[4957]: I0123 10:52:23.111222 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:23 crc kubenswrapper[4957]: I0123 10:52:23.111349 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:23Z","lastTransitionTime":"2026-01-23T10:52:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:23 crc kubenswrapper[4957]: I0123 10:52:23.135560 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z8hcw_87adc28a-89e3-4743-a9f2-098d4a9432d8/ovnkube-controller/2.log" Jan 23 10:52:23 crc kubenswrapper[4957]: I0123 10:52:23.140732 4957 scope.go:117] "RemoveContainer" containerID="1b61126eca35a8279d32f7b9386c382f26da91f6b28007d881463f438155d2e3" Jan 23 10:52:23 crc kubenswrapper[4957]: E0123 10:52:23.141038 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-z8hcw_openshift-ovn-kubernetes(87adc28a-89e3-4743-a9f2-098d4a9432d8)\"" pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" podUID="87adc28a-89e3-4743-a9f2-098d4a9432d8" Jan 23 10:52:23 crc kubenswrapper[4957]: I0123 10:52:23.160549 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:23Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:23 crc kubenswrapper[4957]: I0123 10:52:23.214357 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:23 crc kubenswrapper[4957]: I0123 10:52:23.214407 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:23 crc kubenswrapper[4957]: I0123 10:52:23.214424 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:23 crc kubenswrapper[4957]: I0123 10:52:23.214451 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:23 crc kubenswrapper[4957]: I0123 10:52:23.214471 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:23Z","lastTransitionTime":"2026-01-23T10:52:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:23 crc kubenswrapper[4957]: I0123 10:52:23.219966 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://512cd439903792d034cd6017d149d8f3e9e24ffbfc36964572fc9419d54c3513\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:23Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:23 crc kubenswrapper[4957]: I0123 10:52:23.235818 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rg9hb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a6ddd9-627a-4faa-a4c4-096ea19af31d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f469de9a3c43ade33ae855757f1244dcd825827dea9633af7143c078b08d6d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wngq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rg9hb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:23Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:23 crc kubenswrapper[4957]: I0123 10:52:23.249921 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2xjv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"224e3211-1f68-4673-8975-7e71b1e513d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dd046581d049e9ca0071a010da143a9b28d271b533b9cdc1c94d19311be0320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s26mf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f355e8990ff693448c7b8df392b7b2caeb59d6fee6cf8d5d4200f8ce1b5e03ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s26mf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w2xjv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:23Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:23 crc kubenswrapper[4957]: I0123 10:52:23.262144 4957 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-tlz2g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"233fdd78-4010-4fe8-9068-ee47d8ff25d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6851e0ec1040550b8c9edb1b85213d2c849e381fae6b0f09c9a7247bd9c5088\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpwrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-tlz2g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:23Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:23 crc kubenswrapper[4957]: I0123 10:52:23.285013 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87adc28a-89e3-4743-a9f2-098d4a9432d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a14cf2687aa7c7a4c43dffbc2ad99a41aef0e46719171f63c7f769ee2d54e4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca18567eec1b0cc34d911b28d9f3d670a061722086817f58236f6a0da557262\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cer
t\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f862c6f11fe904458a8ecde92079c0b4aa4a9cb4dfc6f2ca094a1d3142570d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0efa75cf10a812bc4de5b071048558eb5f48828f6fb3049f3820fe5e0b7e2b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1a90ac89ce8ac710e5f8cff26e69aff44a735ef8155a7e93324809904a33e01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26b4bdc4f2514902dc2c95df59af4c954a5c1905821f5981e9437ff54d6d544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b61126eca35a8279d32f7b9386c382f26da91f6b28007d881463f438155d2e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b61126eca35a8279d32f7b9386c382f26da91f6b28007d881463f438155d2e3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-23T10:52:21Z\\\",\\\"message\\\":\\\"e configs for network=default: []services.lbConfig(nil)\\\\nI0123 10:52:21.654230 6620 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-service-ca-operator/metrics]} name:Service_openshift-service-ca-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.40:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {2a3fb1a3-a476-4e14-bcf5-fb79af60206a}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0123 10:52:21.654121 6620 ovnkube_controller.go:900] Cache entry expected pod with UID \\\\\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\\\\\" but failed to find it\\\\nF0123 10:52:21.652450 6620 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to 
start default node network controlle\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-23T10:52:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-z8hcw_openshift-ovn-kubernetes(87adc28a-89e3-4743-a9f2-098d4a9432d8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1be5b459e3fae28da165ef0ee506ec5ccd39026d7b7e7c35a3f242c65d60d0ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3de3a4f166173fb3f0ff7428b1d1ddf43a9988749fc445ecff4a7527bd1f3a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3de3a4f166173fb3f0ff7428b1d1ddf43a9988749fc445ecff4a7527bd1f3a4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z8hcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:23Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:23 crc kubenswrapper[4957]: I0123 10:52:23.300463 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9rkq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"340bb9e5-0a20-4377-acf7-aba4b7788153\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac26363f220465a8681578f7a7b90cf7d0abf8676379a1e963f4998646327c95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:52:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55l78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19e8778b10a2bb2a713374cb69e07a76edf2371f7a130a691e759a03c0322251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:52:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55l78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:52:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r9rkq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:23Z is after 2025-08-24T17:21:41Z" Jan 23 
10:52:23 crc kubenswrapper[4957]: I0123 10:52:23.313383 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb53662e-fe72-4c19-b3a6-f5b541e5afcb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://829cfdb541d2a7861316957f39b8b9f43ec6f9f4e309a491f4451b1f3c34a9a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d9f270c80ebedc7d79510e2f421e23789483dce954f5e1469469703660febf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90926087d1bb350c991fa9425706fcc22e12eec003aba87b72758892aae9d3a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\
",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8144556693b41dc2f9121be49ceed161caf8db5eec797f086128a2016be8072\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:23Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:23 crc kubenswrapper[4957]: I0123 10:52:23.316967 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:23 crc kubenswrapper[4957]: I0123 10:52:23.316995 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:23 crc kubenswrapper[4957]: I0123 10:52:23.317003 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:23 crc kubenswrapper[4957]: I0123 10:52:23.317016 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:23 crc kubenswrapper[4957]: I0123 10:52:23.317025 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:23Z","lastTransitionTime":"2026-01-23T10:52:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:23 crc kubenswrapper[4957]: I0123 10:52:23.327361 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b07c6571fe0e39bd6607feb900919a481ef8a36483c9b4de1c6d5ea3453ba61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:23Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:23 crc kubenswrapper[4957]: I0123 10:52:23.337888 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fnxz6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c7b1449-2e9b-4c07-a531-591cb968f511\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6410b47b5b38b4ce50175e8cd9c2cc7ca241b914d1dba4accf3a1deb3e066ccd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj687\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:49Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fnxz6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:23Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:23 crc kubenswrapper[4957]: I0123 10:52:23.364544 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8631604-63ce-40b0-b27e-fba17f940f20\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://523b9a208f414955faffe254957d3bb6d287eab26ea653e23c9bcc2c3182d5cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80645d17b02b24a907b20d376fcb65a794768d4c9cf07550bff63d50a011836d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3fcfc9fcf5f37f32b4a654710f3f0f5c3fab5b0b5c35239e5f1a2789d1ec480\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b48e3593322a778bf8d56e3509d97a341f1fee5
e172f8ba4bbc4c1dacefb3930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://193091ca5d5fb974b1e2da289e7fbc6e2d3a292e79d1936c7ba10266a5ba9779\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7feaf0932d8842db7af2f28984a811c16674b50ba28a4627829ce8471543daa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7feaf0932d8842db7af2f28984a811c16674b50ba28a4627829ce8471543daa9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://370f9a1676bfe308ad9883454af09a802df678db77e0e246de3cc7aea91410a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://370f9a1676bfe308ad9883454af09a802df678db77e0e246de3cc7aea91410a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://78b38484db8444b196494249f32bd63f2478d80e89ff823c8ac7c974b1ea6e76\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78b38484db8444b196494249f32bd63f2478d80e89ff823c8ac7c974b1ea6e76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:23Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:23 crc kubenswrapper[4957]: I0123 10:52:23.378749 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea507738-b425-4366-808b-3a47317e66d0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cdf71b1a8491d3a4853fde19a5b1af1eb4697cbf07de482e22a52704ba0470f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52eea4c3c7c3b8898e64dd0eb05c1883
ea1c2fa94e7e606f3ab48bbf5aaee8d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f405b6b517d30a201b793965bd82536f496d62b89562cefc7e3a9d9f7829633\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b6915f908509c8609290327ffc2dccf0e5680dc227979285a7ebaca4643cb7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e837e02e63dbe59e7920302c0fb0b5c9165e96ebb684adadb02bacd61633214\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"le observer\\\\nW0123 10:51:48.273886 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0123 10:51:48.273997 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0123 10:51:48.275269 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3502356273/tls.crt::/tmp/serving-cert-3502356273/tls.key\\\\\\\"\\\\nI0123 10:51:48.537137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0123 10:51:48.548481 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0123 10:51:48.548515 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0123 10:51:48.548544 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0123 10:51:48.548577 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0123 10:51:48.561057 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0123 10:51:48.561112 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0123 10:51:48.561123 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0123 10:51:48.561139 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0123 10:51:48.561151 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0123 10:51:48.561158 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0123 10:51:48.561167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0123 10:51:48.561209 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0123 10:51:48.561756 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da77583099215643577c5d064d67ce2cca9d0b74e7ba7c88f3a948a8516fd66c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd61e9841e88c0a34764491cfafe3e4f94233e4ac0543a3e6ed1326dd8d7280d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd61e9841e88c0a34764491cfafe3e4f94233e4ac0543a3e6ed1326dd8d7280d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:23Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:23 crc kubenswrapper[4957]: I0123 10:52:23.391775 4957 status_manager.go:875] "Failed 
to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:23Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:23 crc kubenswrapper[4957]: I0123 10:52:23.405089 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9f19c40d295c11e3a1170d61fd738b1dcd8fb10087f6a1bb74e6e6c8e6cfb56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6fcda9eaf99f7b60db85da6f18a98ccca7b5bc532aa28388fc7845caf1a7356\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:23Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:23 crc kubenswrapper[4957]: I0123 10:52:23.417105 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:23Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:23 crc kubenswrapper[4957]: I0123 10:52:23.419313 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:23 crc kubenswrapper[4957]: I0123 10:52:23.419358 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:23 crc kubenswrapper[4957]: I0123 10:52:23.419375 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:23 crc kubenswrapper[4957]: I0123 10:52:23.419397 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:23 crc kubenswrapper[4957]: I0123 10:52:23.419414 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:23Z","lastTransitionTime":"2026-01-23T10:52:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:23 crc kubenswrapper[4957]: I0123 10:52:23.439815 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6cq2v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11d94cd0-1619-4ef6-952a-aef84e1cdc75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdb19fbdef461009ebd78d9089ba9c94908e4c9fbcab108320e0d89c7f30547f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d28aff8ae7d4adbef5753dc21af0f1055f19ed4404bf9af31f34be215120555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d28aff8ae7d4adbef5753dc21af0f1055f19ed4404bf9af31f34be215120555\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90e2094361e033ef7c8226913e9e396e6a2036617edb9c5ad0ed00c2c5f9f707\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90e2094361e033ef7c8226913e9e396e6a2036617edb9c5ad0ed00c2c5f9f707\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e945a247fdc5833508edb04c5130a406d137e3b0ab1be12f9bf954f2621c3bf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e945a247fdc5833508edb04c5130a406d137e3b0ab1be12f9bf954f2621c3bf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26ffeff282bb20d68e9f648999f28b818a67b74f062714fd26ad22e70591a74c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26ffeff282bb20d68e9f648999f28b818a67b74f062714fd26ad22e70591a74c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bbe74c2c27b97766cfa5d26987cd0fbd8289871e66eab78c54a52c0aa039728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bbe74c2c27b97766cfa5d26987cd0fbd8289871e66eab78c54a52c0aa039728\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9ee5b55e77324735662dd6bc0fdeee86af454eb4b9e8eb9e877119f7c1395ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9ee5b55e77324735662dd6bc0fdeee86af454eb4b9e8eb9e877119f7c1395ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6cq2v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:23Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:23 crc kubenswrapper[4957]: I0123 10:52:23.452768 4957 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/network-metrics-daemon-5fxfb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87775b38-0664-48f6-8857-7568c135bd79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrj7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrj7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:52:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5fxfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:23Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:23 crc kubenswrapper[4957]: I0123 10:52:23.522629 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:23 crc kubenswrapper[4957]: I0123 10:52:23.522699 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:23 crc kubenswrapper[4957]: I0123 10:52:23.522717 4957 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:23 crc kubenswrapper[4957]: I0123 10:52:23.522744 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:23 crc kubenswrapper[4957]: I0123 10:52:23.522766 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:23Z","lastTransitionTime":"2026-01-23T10:52:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:23 crc kubenswrapper[4957]: I0123 10:52:23.626042 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:23 crc kubenswrapper[4957]: I0123 10:52:23.626120 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:23 crc kubenswrapper[4957]: I0123 10:52:23.626146 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:23 crc kubenswrapper[4957]: I0123 10:52:23.626176 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:23 crc kubenswrapper[4957]: I0123 10:52:23.626199 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:23Z","lastTransitionTime":"2026-01-23T10:52:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:23 crc kubenswrapper[4957]: I0123 10:52:23.728765 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:23 crc kubenswrapper[4957]: I0123 10:52:23.728852 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:23 crc kubenswrapper[4957]: I0123 10:52:23.728878 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:23 crc kubenswrapper[4957]: I0123 10:52:23.728914 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:23 crc kubenswrapper[4957]: I0123 10:52:23.728937 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:23Z","lastTransitionTime":"2026-01-23T10:52:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:23 crc kubenswrapper[4957]: I0123 10:52:23.746211 4957 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-11 15:58:01.157814167 +0000 UTC Jan 23 10:52:23 crc kubenswrapper[4957]: I0123 10:52:23.832332 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:23 crc kubenswrapper[4957]: I0123 10:52:23.832444 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:23 crc kubenswrapper[4957]: I0123 10:52:23.832461 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:23 crc kubenswrapper[4957]: I0123 10:52:23.832484 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:23 crc kubenswrapper[4957]: I0123 10:52:23.832498 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:23Z","lastTransitionTime":"2026-01-23T10:52:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:23 crc kubenswrapper[4957]: I0123 10:52:23.935913 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:23 crc kubenswrapper[4957]: I0123 10:52:23.935988 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:23 crc kubenswrapper[4957]: I0123 10:52:23.936006 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:23 crc kubenswrapper[4957]: I0123 10:52:23.936029 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:23 crc kubenswrapper[4957]: I0123 10:52:23.936046 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:23Z","lastTransitionTime":"2026-01-23T10:52:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:24 crc kubenswrapper[4957]: I0123 10:52:24.038800 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:24 crc kubenswrapper[4957]: I0123 10:52:24.038851 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:24 crc kubenswrapper[4957]: I0123 10:52:24.038867 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:24 crc kubenswrapper[4957]: I0123 10:52:24.038890 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:24 crc kubenswrapper[4957]: I0123 10:52:24.038907 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:24Z","lastTransitionTime":"2026-01-23T10:52:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:24 crc kubenswrapper[4957]: I0123 10:52:24.142355 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:24 crc kubenswrapper[4957]: I0123 10:52:24.142426 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:24 crc kubenswrapper[4957]: I0123 10:52:24.142488 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:24 crc kubenswrapper[4957]: I0123 10:52:24.142538 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:24 crc kubenswrapper[4957]: I0123 10:52:24.142561 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:24Z","lastTransitionTime":"2026-01-23T10:52:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:24 crc kubenswrapper[4957]: I0123 10:52:24.245596 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:24 crc kubenswrapper[4957]: I0123 10:52:24.245660 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:24 crc kubenswrapper[4957]: I0123 10:52:24.245678 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:24 crc kubenswrapper[4957]: I0123 10:52:24.245706 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:24 crc kubenswrapper[4957]: I0123 10:52:24.245722 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:24Z","lastTransitionTime":"2026-01-23T10:52:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:24 crc kubenswrapper[4957]: I0123 10:52:24.348439 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:24 crc kubenswrapper[4957]: I0123 10:52:24.348496 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:24 crc kubenswrapper[4957]: I0123 10:52:24.348513 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:24 crc kubenswrapper[4957]: I0123 10:52:24.348535 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:24 crc kubenswrapper[4957]: I0123 10:52:24.348553 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:24Z","lastTransitionTime":"2026-01-23T10:52:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:24 crc kubenswrapper[4957]: I0123 10:52:24.451136 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:24 crc kubenswrapper[4957]: I0123 10:52:24.451201 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:24 crc kubenswrapper[4957]: I0123 10:52:24.451219 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:24 crc kubenswrapper[4957]: I0123 10:52:24.451244 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:24 crc kubenswrapper[4957]: I0123 10:52:24.451265 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:24Z","lastTransitionTime":"2026-01-23T10:52:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:24 crc kubenswrapper[4957]: I0123 10:52:24.546831 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 23 10:52:24 crc kubenswrapper[4957]: I0123 10:52:24.560155 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:24 crc kubenswrapper[4957]: I0123 10:52:24.560219 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:24 crc kubenswrapper[4957]: I0123 10:52:24.560238 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:24 crc kubenswrapper[4957]: I0123 10:52:24.560261 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:24 crc kubenswrapper[4957]: I0123 10:52:24.560324 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:24Z","lastTransitionTime":"2026-01-23T10:52:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:24 crc kubenswrapper[4957]: I0123 10:52:24.564838 4957 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Jan 23 10:52:24 crc kubenswrapper[4957]: I0123 10:52:24.590357 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8631604-63ce-40b0-b27e-fba17f940f20\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://523b9a208f414955faffe254957d3bb6d287eab26ea653e23c9bcc2c3182d5cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPat
h\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80645d17b02b24a907b20d376fcb65a794768d4c9cf07550bff63d50a011836d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3fcfc9fcf5f37f32b4a654710f3f0f5c3fab5b0b5c35239e5f1a2789d1ec480\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b48e3593322a778bf8d56e3509d97a341f1fee5e172f8ba4bbc4c1dacefb3930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://193091ca5d5fb974b1e2da289e7fbc6e2d3a292e79d1936c7ba10266a5ba9779\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{
\\\"containerID\\\":\\\"cri-o://7feaf0932d8842db7af2f28984a811c16674b50ba28a4627829ce8471543daa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7feaf0932d8842db7af2f28984a811c16674b50ba28a4627829ce8471543daa9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://370f9a1676bfe308ad9883454af09a802df678db77e0e246de3cc7aea91410a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://370f9a1676bfe308ad9883454af09a802df678db77e0e246de3cc7aea91410a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://78b38484db8444b196494249f32bd63f2478d80e89ff823c8ac7c974b1ea6e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78b38484db8444b196494249f32bd63f2478d80e89ff823c8ac7c974b1ea6e76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:24Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:24 crc kubenswrapper[4957]: I0123 10:52:24.610154 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b07c6571fe0e39bd6607feb900919a481ef8a36483c9b4de1c6d5ea3453ba61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:24Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:24 crc kubenswrapper[4957]: I0123 10:52:24.626055 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fnxz6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c7b1449-2e9b-4c07-a531-591cb968f511\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6410b47b5b38b4ce50175e8cd9c2cc7ca241b914d1dba4accf3a1deb3e066ccd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj687\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:49Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fnxz6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:24Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:24 crc kubenswrapper[4957]: I0123 10:52:24.644888 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:24Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:24 crc kubenswrapper[4957]: I0123 10:52:24.661467 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9f19c40d295c11e3a1170d61fd738b1dcd8fb10087f6a1bb74e6e6c8e6cfb56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6fcda9eaf99f7b60db85da6f18a98ccca7b5bc532aa28388fc7845caf1a7356\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:24Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:24 crc kubenswrapper[4957]: I0123 10:52:24.663425 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:24 crc kubenswrapper[4957]: I0123 10:52:24.663486 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:24 crc kubenswrapper[4957]: I0123 10:52:24.663499 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:24 crc kubenswrapper[4957]: I0123 10:52:24.663516 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:24 crc kubenswrapper[4957]: I0123 10:52:24.663873 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:24Z","lastTransitionTime":"2026-01-23T10:52:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:24 crc kubenswrapper[4957]: I0123 10:52:24.682174 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:24Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:24 crc kubenswrapper[4957]: I0123 10:52:24.702092 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6cq2v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11d94cd0-1619-4ef6-952a-aef84e1cdc75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdb19fbdef461009ebd78d9089ba9c94908e4c9fbcab108320e0d89c7f30547f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d28aff8ae7d4adbef5753dc21af0f1055f19ed4404bf9af31f34be215120555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d28aff8ae7d4adbef5753dc21af0f1055f19ed4404bf9af31f34be215120555\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90e2094361e033ef7c8226913e9e396e6a2036617edb9c5ad0ed00c2c5f9f707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90e2094361e033ef7c8226913e9e396e6a2036617edb9c5ad0ed00c2c5f9f707\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e945a247fdc5833508edb04c5130a406d137e3b0ab1be12f9bf954f2621c3bf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e945a247fdc5833508edb04c5130a406d137e3b0ab1be12f9bf954f2621c3bf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26ffeff282bb20d68e9f648999f28b818a67b74f062714fd26ad22e70591a74c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26ffeff282bb20d68e9f648999f28b818a67b74f062714fd26ad22e70591a74c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bbe74c2c27b97766cfa5d26987cd0fbd8289871e66eab78c54a52c0aa039728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bbe74c2c27b97766cfa5d26987cd0fbd8289871e66eab78c54a52c0aa039728\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9ee5b55e77324735662dd6bc0fdeee86af454eb4b9e8eb9e877119f7c1395ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9ee5b55e77324735662dd6bc0fdeee86af454eb4b9e8eb9e877119f7c1395ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6cq2v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:24Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:24 crc kubenswrapper[4957]: I0123 10:52:24.719183 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5fxfb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"87775b38-0664-48f6-8857-7568c135bd79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrj7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrj7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:52:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5fxfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:24Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:24 crc kubenswrapper[4957]: I0123 10:52:24.736991 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea507738-b425-4366-808b-3a47317e66d0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cdf71b1a8491d3a4853fde19a5b1af1eb4697cbf07de482e22a52704ba0470f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52eea4c3c7c3b8898e64dd0eb05c1883ea1c2fa94e7e606f3ab48bbf5aaee8d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f405b6b517d30a201b793965bd82536f496d62b89562cefc7e3a9d9f7829633\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b6915f908509c8609290327ffc2dccf0e5680dc227979285a7ebaca4643cb7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e837e02e63dbe59e7920302c0fb0b5c9165e96ebb684adadb02bacd61633214\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"le observer\\\\nW0123 10:51:48.273886 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0123 10:51:48.273997 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0123 10:51:48.275269 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3502356273/tls.crt::/tmp/serving-cert-3502356273/tls.key\\\\\\\"\\\\nI0123 10:51:48.537137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0123 10:51:48.548481 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0123 10:51:48.548515 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0123 10:51:48.548544 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0123 10:51:48.548577 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0123 10:51:48.561057 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0123 10:51:48.561112 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0123 10:51:48.561123 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0123 10:51:48.561139 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0123 10:51:48.561151 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0123 10:51:48.561158 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0123 10:51:48.561167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0123 10:51:48.561209 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0123 10:51:48.561756 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da77583099215643577c5d064d67ce2cca9d0b74e7ba7c88f3a948a8516fd66c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd61e9841e88c0a34764491cfafe3e4f94233e4ac0543a3e6ed1326dd8d7280d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd61e9841e88c0a34764491cfafe3e4f94233e4ac0543a3e6ed1326dd8d7280d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:24Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:24 crc kubenswrapper[4957]: I0123 10:52:24.746369 4957 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-07 22:14:47.496431998 +0000 UTC Jan 23 10:52:24 crc kubenswrapper[4957]: I0123 10:52:24.757346 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://512cd439903792d034cd6017d149d8f3e9e24ffbfc36964572fc9419d54c3513\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:24Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:24 crc kubenswrapper[4957]: I0123 10:52:24.766909 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:24 crc kubenswrapper[4957]: I0123 10:52:24.766963 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:24 crc kubenswrapper[4957]: I0123 10:52:24.766980 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:24 crc kubenswrapper[4957]: I0123 10:52:24.767006 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:24 crc kubenswrapper[4957]: I0123 10:52:24.767026 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:24Z","lastTransitionTime":"2026-01-23T10:52:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:24 crc kubenswrapper[4957]: I0123 10:52:24.769306 4957 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 23 10:52:24 crc kubenswrapper[4957]: I0123 10:52:24.769358 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 23 10:52:24 crc kubenswrapper[4957]: I0123 10:52:24.769442 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 23 10:52:24 crc kubenswrapper[4957]: E0123 10:52:24.769442 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 23 10:52:24 crc kubenswrapper[4957]: I0123 10:52:24.769459 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5fxfb" Jan 23 10:52:24 crc kubenswrapper[4957]: E0123 10:52:24.769557 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 23 10:52:24 crc kubenswrapper[4957]: E0123 10:52:24.769787 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 23 10:52:24 crc kubenswrapper[4957]: E0123 10:52:24.769876 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5fxfb" podUID="87775b38-0664-48f6-8857-7568c135bd79" Jan 23 10:52:24 crc kubenswrapper[4957]: I0123 10:52:24.775020 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:24Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:24 crc kubenswrapper[4957]: I0123 10:52:24.793725 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tlz2g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"233fdd78-4010-4fe8-9068-ee47d8ff25d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6851e0ec1040550b8c9edb1b85213d2c849e381fae6b0f09c9a7247bd9c5088\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpwrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tlz2g\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:24Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:24 crc kubenswrapper[4957]: I0123 10:52:24.823359 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87adc28a-89e3-4743-a9f2-098d4a9432d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a14cf2687aa7c7a4c43dffbc2ad99a41aef0e46719171f63c7f769ee2d54e4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca18567eec1b0cc34d911b28d9f3d670a061722086817f58236f6a0da557262\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f862c6f11fe904458a8ecde92079c0b4aa4a9cb4dfc6f2ca094a1d3142570d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0efa75cf10a812bc4de5b071048558eb5f48828f6fb3049f3820fe5e0b7e2b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1a90ac89ce8ac710e5f8cff26e69aff44a735ef8155a7e93324809904a33e01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26b4b
dc4f2514902dc2c95df59af4c954a5c1905821f5981e9437ff54d6d544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b61126eca35a8279d32f7b9386c382f26da91f6b28007d881463f438155d2e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b61126eca35a8279d32f7b9386c382f26da91f6b28007d881463f438155d2e3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-23T10:52:21Z\\\",\\\"message\\\":\\\"e configs for network=default: []services.lbConfig(nil)\\\\nI0123 10:52:21.654230 6620 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-service-ca-operator/metrics]} name:Service_openshift-service-ca-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.40:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {2a3fb1a3-a476-4e14-bcf5-fb79af60206a}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0123 10:52:21.654121 6620 ovnkube_controller.go:900] Cache entry expected pod with UID \\\\\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\\\\\" but failed to find it\\\\nF0123 10:52:21.652450 6620 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network 
controlle\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-23T10:52:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-z8hcw_openshift-ovn-kubernetes(87adc28a-89e3-4743-a9f2-098d4a9432d8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1be5b459e3fae28da165ef0ee506ec5ccd39026d7b7e7c35a3f242c65d60d0ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"rec
ursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3de3a4f166173fb3f0ff7428b1d1ddf43a9988749fc445ecff4a7527bd1f3a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3de3a4f166173fb3f0ff7428b1d1ddf43a9988749fc445ecff4a7527bd1f3a4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z8hcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:24Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:24 crc kubenswrapper[4957]: I0123 10:52:24.842398 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9rkq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"340bb9e5-0a20-4377-acf7-aba4b7788153\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac26363f220465a8681578f7a7b90cf7d0abf8676379a1e963f4998646327c95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:52:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55l78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19e8778b10a2bb2a713374cb69e07a76edf2371f7a130a691e759a03c0322251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:52:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55l78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:52:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r9rkq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:24Z is after 2025-08-24T17:21:41Z" Jan 23 
10:52:24 crc kubenswrapper[4957]: I0123 10:52:24.859387 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb53662e-fe72-4c19-b3a6-f5b541e5afcb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://829cfdb541d2a7861316957f39b8b9f43ec6f9f4e309a491f4451b1f3c34a9a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d9f270c80ebedc7d79510e2f421e23789483dce954f5e1469469703660febf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90926087d1bb350c991fa9425706fcc22e12eec003aba87b72758892aae9d3a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\
",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8144556693b41dc2f9121be49ceed161caf8db5eec797f086128a2016be8072\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:24Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:24 crc kubenswrapper[4957]: I0123 10:52:24.869866 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:24 crc kubenswrapper[4957]: I0123 10:52:24.870229 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:24 crc kubenswrapper[4957]: I0123 10:52:24.870273 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:24 crc kubenswrapper[4957]: I0123 10:52:24.870347 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:24 crc kubenswrapper[4957]: I0123 10:52:24.870373 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:24Z","lastTransitionTime":"2026-01-23T10:52:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:24 crc kubenswrapper[4957]: I0123 10:52:24.871680 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rg9hb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a6ddd9-627a-4faa-a4c4-096ea19af31d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f469de9a3c43ade33ae855757f1244dcd825827dea9633af7143c078b08d6d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wngq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rg9hb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:24Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:24 crc kubenswrapper[4957]: I0123 10:52:24.888739 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2xjv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"224e3211-1f68-4673-8975-7e71b1e513d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dd046581d049e9ca0071a010da143a9b28d271b533b9cdc1c94d19311be0320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s26mf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f355e8990ff693448c7b8df392b7b2caeb59d6fee6cf8d5d4200f8ce1b5e03ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s26mf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w2xjv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:24Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:24 crc kubenswrapper[4957]: I0123 10:52:24.973114 4957 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:24 crc kubenswrapper[4957]: I0123 10:52:24.973190 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:24 crc kubenswrapper[4957]: I0123 10:52:24.973215 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:24 crc kubenswrapper[4957]: I0123 10:52:24.973246 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:24 crc kubenswrapper[4957]: I0123 10:52:24.973267 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:24Z","lastTransitionTime":"2026-01-23T10:52:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:25 crc kubenswrapper[4957]: I0123 10:52:25.076887 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:25 crc kubenswrapper[4957]: I0123 10:52:25.076942 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:25 crc kubenswrapper[4957]: I0123 10:52:25.076964 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:25 crc kubenswrapper[4957]: I0123 10:52:25.076993 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:25 crc kubenswrapper[4957]: I0123 10:52:25.077019 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:25Z","lastTransitionTime":"2026-01-23T10:52:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:25 crc kubenswrapper[4957]: I0123 10:52:25.179906 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:25 crc kubenswrapper[4957]: I0123 10:52:25.179973 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:25 crc kubenswrapper[4957]: I0123 10:52:25.179985 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:25 crc kubenswrapper[4957]: I0123 10:52:25.180010 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:25 crc kubenswrapper[4957]: I0123 10:52:25.180026 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:25Z","lastTransitionTime":"2026-01-23T10:52:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:25 crc kubenswrapper[4957]: I0123 10:52:25.283626 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:25 crc kubenswrapper[4957]: I0123 10:52:25.284054 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:25 crc kubenswrapper[4957]: I0123 10:52:25.284160 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:25 crc kubenswrapper[4957]: I0123 10:52:25.284261 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:25 crc kubenswrapper[4957]: I0123 10:52:25.284404 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:25Z","lastTransitionTime":"2026-01-23T10:52:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:25 crc kubenswrapper[4957]: I0123 10:52:25.388400 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:25 crc kubenswrapper[4957]: I0123 10:52:25.388731 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:25 crc kubenswrapper[4957]: I0123 10:52:25.388872 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:25 crc kubenswrapper[4957]: I0123 10:52:25.389013 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:25 crc kubenswrapper[4957]: I0123 10:52:25.389148 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:25Z","lastTransitionTime":"2026-01-23T10:52:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:25 crc kubenswrapper[4957]: I0123 10:52:25.492204 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:25 crc kubenswrapper[4957]: I0123 10:52:25.492275 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:25 crc kubenswrapper[4957]: I0123 10:52:25.492346 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:25 crc kubenswrapper[4957]: I0123 10:52:25.492376 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:25 crc kubenswrapper[4957]: I0123 10:52:25.492401 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:25Z","lastTransitionTime":"2026-01-23T10:52:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:25 crc kubenswrapper[4957]: I0123 10:52:25.595666 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:25 crc kubenswrapper[4957]: I0123 10:52:25.595717 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:25 crc kubenswrapper[4957]: I0123 10:52:25.595736 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:25 crc kubenswrapper[4957]: I0123 10:52:25.595760 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:25 crc kubenswrapper[4957]: I0123 10:52:25.595778 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:25Z","lastTransitionTime":"2026-01-23T10:52:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:25 crc kubenswrapper[4957]: I0123 10:52:25.699050 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:25 crc kubenswrapper[4957]: I0123 10:52:25.699092 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:25 crc kubenswrapper[4957]: I0123 10:52:25.699105 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:25 crc kubenswrapper[4957]: I0123 10:52:25.699123 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:25 crc kubenswrapper[4957]: I0123 10:52:25.699135 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:25Z","lastTransitionTime":"2026-01-23T10:52:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:25 crc kubenswrapper[4957]: I0123 10:52:25.747204 4957 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-10 00:58:54.599613116 +0000 UTC Jan 23 10:52:25 crc kubenswrapper[4957]: I0123 10:52:25.802022 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:25 crc kubenswrapper[4957]: I0123 10:52:25.802065 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:25 crc kubenswrapper[4957]: I0123 10:52:25.802076 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:25 crc kubenswrapper[4957]: I0123 10:52:25.802091 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:25 crc kubenswrapper[4957]: I0123 10:52:25.802102 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:25Z","lastTransitionTime":"2026-01-23T10:52:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:25 crc kubenswrapper[4957]: I0123 10:52:25.905608 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:25 crc kubenswrapper[4957]: I0123 10:52:25.905677 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:25 crc kubenswrapper[4957]: I0123 10:52:25.905695 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:25 crc kubenswrapper[4957]: I0123 10:52:25.905720 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:25 crc kubenswrapper[4957]: I0123 10:52:25.905760 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:25Z","lastTransitionTime":"2026-01-23T10:52:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:26 crc kubenswrapper[4957]: I0123 10:52:26.009193 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:26 crc kubenswrapper[4957]: I0123 10:52:26.009240 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:26 crc kubenswrapper[4957]: I0123 10:52:26.009252 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:26 crc kubenswrapper[4957]: I0123 10:52:26.009269 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:26 crc kubenswrapper[4957]: I0123 10:52:26.009305 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:26Z","lastTransitionTime":"2026-01-23T10:52:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:26 crc kubenswrapper[4957]: I0123 10:52:26.112387 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:26 crc kubenswrapper[4957]: I0123 10:52:26.112467 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:26 crc kubenswrapper[4957]: I0123 10:52:26.112486 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:26 crc kubenswrapper[4957]: I0123 10:52:26.112517 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:26 crc kubenswrapper[4957]: I0123 10:52:26.112536 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:26Z","lastTransitionTime":"2026-01-23T10:52:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:26 crc kubenswrapper[4957]: I0123 10:52:26.215942 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:26 crc kubenswrapper[4957]: I0123 10:52:26.216010 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:26 crc kubenswrapper[4957]: I0123 10:52:26.216029 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:26 crc kubenswrapper[4957]: I0123 10:52:26.216058 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:26 crc kubenswrapper[4957]: I0123 10:52:26.216080 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:26Z","lastTransitionTime":"2026-01-23T10:52:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:26 crc kubenswrapper[4957]: I0123 10:52:26.318595 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:26 crc kubenswrapper[4957]: I0123 10:52:26.318651 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:26 crc kubenswrapper[4957]: I0123 10:52:26.318670 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:26 crc kubenswrapper[4957]: I0123 10:52:26.318693 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:26 crc kubenswrapper[4957]: I0123 10:52:26.318710 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:26Z","lastTransitionTime":"2026-01-23T10:52:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:26 crc kubenswrapper[4957]: I0123 10:52:26.421304 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:26 crc kubenswrapper[4957]: I0123 10:52:26.421341 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:26 crc kubenswrapper[4957]: I0123 10:52:26.421350 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:26 crc kubenswrapper[4957]: I0123 10:52:26.421363 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:26 crc kubenswrapper[4957]: I0123 10:52:26.421372 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:26Z","lastTransitionTime":"2026-01-23T10:52:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:26 crc kubenswrapper[4957]: I0123 10:52:26.523900 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:26 crc kubenswrapper[4957]: I0123 10:52:26.523953 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:26 crc kubenswrapper[4957]: I0123 10:52:26.523963 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:26 crc kubenswrapper[4957]: I0123 10:52:26.523994 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:26 crc kubenswrapper[4957]: I0123 10:52:26.524006 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:26Z","lastTransitionTime":"2026-01-23T10:52:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:26 crc kubenswrapper[4957]: I0123 10:52:26.626922 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:26 crc kubenswrapper[4957]: I0123 10:52:26.627023 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:26 crc kubenswrapper[4957]: I0123 10:52:26.627044 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:26 crc kubenswrapper[4957]: I0123 10:52:26.627090 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:26 crc kubenswrapper[4957]: I0123 10:52:26.627117 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:26Z","lastTransitionTime":"2026-01-23T10:52:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:26 crc kubenswrapper[4957]: I0123 10:52:26.729827 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:26 crc kubenswrapper[4957]: I0123 10:52:26.729867 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:26 crc kubenswrapper[4957]: I0123 10:52:26.729878 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:26 crc kubenswrapper[4957]: I0123 10:52:26.729895 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:26 crc kubenswrapper[4957]: I0123 10:52:26.729907 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:26Z","lastTransitionTime":"2026-01-23T10:52:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:26 crc kubenswrapper[4957]: I0123 10:52:26.747339 4957 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-08 00:23:14.394254951 +0000 UTC Jan 23 10:52:26 crc kubenswrapper[4957]: I0123 10:52:26.769743 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 23 10:52:26 crc kubenswrapper[4957]: I0123 10:52:26.769752 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 23 10:52:26 crc kubenswrapper[4957]: I0123 10:52:26.769949 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5fxfb" Jan 23 10:52:26 crc kubenswrapper[4957]: I0123 10:52:26.769759 4957 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 23 10:52:26 crc kubenswrapper[4957]: E0123 10:52:26.770011 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 23 10:52:26 crc kubenswrapper[4957]: E0123 10:52:26.769902 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 23 10:52:26 crc kubenswrapper[4957]: E0123 10:52:26.770097 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5fxfb" podUID="87775b38-0664-48f6-8857-7568c135bd79" Jan 23 10:52:26 crc kubenswrapper[4957]: E0123 10:52:26.770248 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 23 10:52:26 crc kubenswrapper[4957]: I0123 10:52:26.832791 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:26 crc kubenswrapper[4957]: I0123 10:52:26.832825 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:26 crc kubenswrapper[4957]: I0123 10:52:26.832836 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:26 crc kubenswrapper[4957]: I0123 10:52:26.832851 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:26 crc kubenswrapper[4957]: I0123 10:52:26.832862 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:26Z","lastTransitionTime":"2026-01-23T10:52:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:26 crc kubenswrapper[4957]: I0123 10:52:26.934447 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:26 crc kubenswrapper[4957]: I0123 10:52:26.934488 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:26 crc kubenswrapper[4957]: I0123 10:52:26.934498 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:26 crc kubenswrapper[4957]: I0123 10:52:26.934511 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:26 crc kubenswrapper[4957]: I0123 10:52:26.934520 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:26Z","lastTransitionTime":"2026-01-23T10:52:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:27 crc kubenswrapper[4957]: I0123 10:52:27.037046 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:27 crc kubenswrapper[4957]: I0123 10:52:27.037099 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:27 crc kubenswrapper[4957]: I0123 10:52:27.037116 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:27 crc kubenswrapper[4957]: I0123 10:52:27.037137 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:27 crc kubenswrapper[4957]: I0123 10:52:27.037153 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:27Z","lastTransitionTime":"2026-01-23T10:52:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:27 crc kubenswrapper[4957]: I0123 10:52:27.139562 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:27 crc kubenswrapper[4957]: I0123 10:52:27.139621 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:27 crc kubenswrapper[4957]: I0123 10:52:27.139640 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:27 crc kubenswrapper[4957]: I0123 10:52:27.139667 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:27 crc kubenswrapper[4957]: I0123 10:52:27.139685 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:27Z","lastTransitionTime":"2026-01-23T10:52:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:27 crc kubenswrapper[4957]: I0123 10:52:27.242754 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:27 crc kubenswrapper[4957]: I0123 10:52:27.242796 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:27 crc kubenswrapper[4957]: I0123 10:52:27.242807 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:27 crc kubenswrapper[4957]: I0123 10:52:27.242828 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:27 crc kubenswrapper[4957]: I0123 10:52:27.242839 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:27Z","lastTransitionTime":"2026-01-23T10:52:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:27 crc kubenswrapper[4957]: I0123 10:52:27.345434 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:27 crc kubenswrapper[4957]: I0123 10:52:27.345474 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:27 crc kubenswrapper[4957]: I0123 10:52:27.345485 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:27 crc kubenswrapper[4957]: I0123 10:52:27.345501 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:27 crc kubenswrapper[4957]: I0123 10:52:27.345512 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:27Z","lastTransitionTime":"2026-01-23T10:52:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:27 crc kubenswrapper[4957]: I0123 10:52:27.448322 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:27 crc kubenswrapper[4957]: I0123 10:52:27.448386 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:27 crc kubenswrapper[4957]: I0123 10:52:27.448402 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:27 crc kubenswrapper[4957]: I0123 10:52:27.448435 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:27 crc kubenswrapper[4957]: I0123 10:52:27.448471 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:27Z","lastTransitionTime":"2026-01-23T10:52:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:27 crc kubenswrapper[4957]: I0123 10:52:27.550749 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:27 crc kubenswrapper[4957]: I0123 10:52:27.550830 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:27 crc kubenswrapper[4957]: I0123 10:52:27.550847 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:27 crc kubenswrapper[4957]: I0123 10:52:27.550871 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:27 crc kubenswrapper[4957]: I0123 10:52:27.550889 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:27Z","lastTransitionTime":"2026-01-23T10:52:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:27 crc kubenswrapper[4957]: I0123 10:52:27.654162 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:27 crc kubenswrapper[4957]: I0123 10:52:27.654221 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:27 crc kubenswrapper[4957]: I0123 10:52:27.654233 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:27 crc kubenswrapper[4957]: I0123 10:52:27.654253 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:27 crc kubenswrapper[4957]: I0123 10:52:27.654265 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:27Z","lastTransitionTime":"2026-01-23T10:52:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:27 crc kubenswrapper[4957]: I0123 10:52:27.747689 4957 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-14 00:16:55.724395921 +0000 UTC Jan 23 10:52:27 crc kubenswrapper[4957]: I0123 10:52:27.757133 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:27 crc kubenswrapper[4957]: I0123 10:52:27.757175 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:27 crc kubenswrapper[4957]: I0123 10:52:27.757185 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:27 crc kubenswrapper[4957]: I0123 10:52:27.757200 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:27 crc kubenswrapper[4957]: I0123 10:52:27.757211 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:27Z","lastTransitionTime":"2026-01-23T10:52:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:27 crc kubenswrapper[4957]: I0123 10:52:27.859663 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:27 crc kubenswrapper[4957]: I0123 10:52:27.859746 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:27 crc kubenswrapper[4957]: I0123 10:52:27.859769 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:27 crc kubenswrapper[4957]: I0123 10:52:27.859803 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:27 crc kubenswrapper[4957]: I0123 10:52:27.859823 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:27Z","lastTransitionTime":"2026-01-23T10:52:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:27 crc kubenswrapper[4957]: I0123 10:52:27.963162 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:27 crc kubenswrapper[4957]: I0123 10:52:27.963213 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:27 crc kubenswrapper[4957]: I0123 10:52:27.963225 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:27 crc kubenswrapper[4957]: I0123 10:52:27.963243 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:27 crc kubenswrapper[4957]: I0123 10:52:27.963259 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:27Z","lastTransitionTime":"2026-01-23T10:52:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:28 crc kubenswrapper[4957]: I0123 10:52:28.065742 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:28 crc kubenswrapper[4957]: I0123 10:52:28.065782 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:28 crc kubenswrapper[4957]: I0123 10:52:28.065793 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:28 crc kubenswrapper[4957]: I0123 10:52:28.065809 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:28 crc kubenswrapper[4957]: I0123 10:52:28.065820 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:28Z","lastTransitionTime":"2026-01-23T10:52:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:28 crc kubenswrapper[4957]: I0123 10:52:28.168025 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:28 crc kubenswrapper[4957]: I0123 10:52:28.168083 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:28 crc kubenswrapper[4957]: I0123 10:52:28.168101 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:28 crc kubenswrapper[4957]: I0123 10:52:28.168126 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:28 crc kubenswrapper[4957]: I0123 10:52:28.168187 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:28Z","lastTransitionTime":"2026-01-23T10:52:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:28 crc kubenswrapper[4957]: I0123 10:52:28.270798 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:28 crc kubenswrapper[4957]: I0123 10:52:28.270848 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:28 crc kubenswrapper[4957]: I0123 10:52:28.270864 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:28 crc kubenswrapper[4957]: I0123 10:52:28.270883 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:28 crc kubenswrapper[4957]: I0123 10:52:28.270899 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:28Z","lastTransitionTime":"2026-01-23T10:52:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:28 crc kubenswrapper[4957]: I0123 10:52:28.373612 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:28 crc kubenswrapper[4957]: I0123 10:52:28.373648 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:28 crc kubenswrapper[4957]: I0123 10:52:28.373656 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:28 crc kubenswrapper[4957]: I0123 10:52:28.373669 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:28 crc kubenswrapper[4957]: I0123 10:52:28.373679 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:28Z","lastTransitionTime":"2026-01-23T10:52:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:28 crc kubenswrapper[4957]: I0123 10:52:28.476912 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:28 crc kubenswrapper[4957]: I0123 10:52:28.476965 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:28 crc kubenswrapper[4957]: I0123 10:52:28.476977 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:28 crc kubenswrapper[4957]: I0123 10:52:28.476995 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:28 crc kubenswrapper[4957]: I0123 10:52:28.477010 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:28Z","lastTransitionTime":"2026-01-23T10:52:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:28 crc kubenswrapper[4957]: I0123 10:52:28.579814 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:28 crc kubenswrapper[4957]: I0123 10:52:28.579896 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:28 crc kubenswrapper[4957]: I0123 10:52:28.579915 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:28 crc kubenswrapper[4957]: I0123 10:52:28.579942 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:28 crc kubenswrapper[4957]: I0123 10:52:28.579959 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:28Z","lastTransitionTime":"2026-01-23T10:52:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:28 crc kubenswrapper[4957]: I0123 10:52:28.682713 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:28 crc kubenswrapper[4957]: I0123 10:52:28.682798 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:28 crc kubenswrapper[4957]: I0123 10:52:28.682835 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:28 crc kubenswrapper[4957]: I0123 10:52:28.682906 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:28 crc kubenswrapper[4957]: I0123 10:52:28.682930 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:28Z","lastTransitionTime":"2026-01-23T10:52:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:28 crc kubenswrapper[4957]: I0123 10:52:28.747904 4957 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-18 09:25:16.015276781 +0000 UTC Jan 23 10:52:28 crc kubenswrapper[4957]: I0123 10:52:28.769478 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 23 10:52:28 crc kubenswrapper[4957]: I0123 10:52:28.769509 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 23 10:52:28 crc kubenswrapper[4957]: E0123 10:52:28.769608 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 23 10:52:28 crc kubenswrapper[4957]: I0123 10:52:28.769478 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 23 10:52:28 crc kubenswrapper[4957]: I0123 10:52:28.769782 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5fxfb" Jan 23 10:52:28 crc kubenswrapper[4957]: E0123 10:52:28.769899 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 23 10:52:28 crc kubenswrapper[4957]: E0123 10:52:28.770022 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5fxfb" podUID="87775b38-0664-48f6-8857-7568c135bd79" Jan 23 10:52:28 crc kubenswrapper[4957]: E0123 10:52:28.770185 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 23 10:52:28 crc kubenswrapper[4957]: I0123 10:52:28.784735 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:28 crc kubenswrapper[4957]: I0123 10:52:28.784995 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:28 crc kubenswrapper[4957]: I0123 10:52:28.785066 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:28 crc kubenswrapper[4957]: I0123 10:52:28.785132 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:28 crc kubenswrapper[4957]: I0123 10:52:28.785189 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:28Z","lastTransitionTime":"2026-01-23T10:52:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:28 crc kubenswrapper[4957]: I0123 10:52:28.891103 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:28 crc kubenswrapper[4957]: I0123 10:52:28.891998 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:28 crc kubenswrapper[4957]: I0123 10:52:28.892099 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:28 crc kubenswrapper[4957]: I0123 10:52:28.892165 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:28 crc kubenswrapper[4957]: I0123 10:52:28.892225 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:28Z","lastTransitionTime":"2026-01-23T10:52:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:28 crc kubenswrapper[4957]: I0123 10:52:28.994926 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:28 crc kubenswrapper[4957]: I0123 10:52:28.994994 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:28 crc kubenswrapper[4957]: I0123 10:52:28.995014 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:28 crc kubenswrapper[4957]: I0123 10:52:28.995041 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:28 crc kubenswrapper[4957]: I0123 10:52:28.995059 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:28Z","lastTransitionTime":"2026-01-23T10:52:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:29 crc kubenswrapper[4957]: I0123 10:52:29.098368 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:29 crc kubenswrapper[4957]: I0123 10:52:29.098713 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:29 crc kubenswrapper[4957]: I0123 10:52:29.098859 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:29 crc kubenswrapper[4957]: I0123 10:52:29.099035 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:29 crc kubenswrapper[4957]: I0123 10:52:29.099338 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:29Z","lastTransitionTime":"2026-01-23T10:52:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:29 crc kubenswrapper[4957]: I0123 10:52:29.202199 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:29 crc kubenswrapper[4957]: I0123 10:52:29.202250 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:29 crc kubenswrapper[4957]: I0123 10:52:29.202259 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:29 crc kubenswrapper[4957]: I0123 10:52:29.202274 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:29 crc kubenswrapper[4957]: I0123 10:52:29.202299 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:29Z","lastTransitionTime":"2026-01-23T10:52:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:29 crc kubenswrapper[4957]: I0123 10:52:29.305105 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:29 crc kubenswrapper[4957]: I0123 10:52:29.305489 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:29 crc kubenswrapper[4957]: I0123 10:52:29.305603 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:29 crc kubenswrapper[4957]: I0123 10:52:29.305759 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:29 crc kubenswrapper[4957]: I0123 10:52:29.305893 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:29Z","lastTransitionTime":"2026-01-23T10:52:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:29 crc kubenswrapper[4957]: I0123 10:52:29.409169 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:29 crc kubenswrapper[4957]: I0123 10:52:29.409216 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:29 crc kubenswrapper[4957]: I0123 10:52:29.409234 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:29 crc kubenswrapper[4957]: I0123 10:52:29.409259 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:29 crc kubenswrapper[4957]: I0123 10:52:29.409277 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:29Z","lastTransitionTime":"2026-01-23T10:52:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:29 crc kubenswrapper[4957]: I0123 10:52:29.512093 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:29 crc kubenswrapper[4957]: I0123 10:52:29.512450 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:29 crc kubenswrapper[4957]: I0123 10:52:29.512636 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:29 crc kubenswrapper[4957]: I0123 10:52:29.512829 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:29 crc kubenswrapper[4957]: I0123 10:52:29.513012 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:29Z","lastTransitionTime":"2026-01-23T10:52:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:29 crc kubenswrapper[4957]: I0123 10:52:29.616958 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:29 crc kubenswrapper[4957]: I0123 10:52:29.617418 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:29 crc kubenswrapper[4957]: I0123 10:52:29.617581 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:29 crc kubenswrapper[4957]: I0123 10:52:29.617775 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:29 crc kubenswrapper[4957]: I0123 10:52:29.618016 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:29Z","lastTransitionTime":"2026-01-23T10:52:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:29 crc kubenswrapper[4957]: I0123 10:52:29.721150 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:29 crc kubenswrapper[4957]: I0123 10:52:29.721197 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:29 crc kubenswrapper[4957]: I0123 10:52:29.721208 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:29 crc kubenswrapper[4957]: I0123 10:52:29.721225 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:29 crc kubenswrapper[4957]: I0123 10:52:29.721236 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:29Z","lastTransitionTime":"2026-01-23T10:52:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:29 crc kubenswrapper[4957]: I0123 10:52:29.748694 4957 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-09 06:05:21.626418751 +0000 UTC Jan 23 10:52:29 crc kubenswrapper[4957]: I0123 10:52:29.823584 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:29 crc kubenswrapper[4957]: I0123 10:52:29.823670 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:29 crc kubenswrapper[4957]: I0123 10:52:29.823693 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:29 crc kubenswrapper[4957]: I0123 10:52:29.823722 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:29 crc kubenswrapper[4957]: I0123 10:52:29.823743 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:29Z","lastTransitionTime":"2026-01-23T10:52:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:29 crc kubenswrapper[4957]: I0123 10:52:29.926901 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:29 crc kubenswrapper[4957]: I0123 10:52:29.926938 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:29 crc kubenswrapper[4957]: I0123 10:52:29.926950 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:29 crc kubenswrapper[4957]: I0123 10:52:29.926968 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:29 crc kubenswrapper[4957]: I0123 10:52:29.926980 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:29Z","lastTransitionTime":"2026-01-23T10:52:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:30 crc kubenswrapper[4957]: I0123 10:52:30.029976 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:30 crc kubenswrapper[4957]: I0123 10:52:30.030040 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:30 crc kubenswrapper[4957]: I0123 10:52:30.030062 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:30 crc kubenswrapper[4957]: I0123 10:52:30.030087 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:30 crc kubenswrapper[4957]: I0123 10:52:30.030105 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:30Z","lastTransitionTime":"2026-01-23T10:52:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:30 crc kubenswrapper[4957]: I0123 10:52:30.132510 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:30 crc kubenswrapper[4957]: I0123 10:52:30.132547 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:30 crc kubenswrapper[4957]: I0123 10:52:30.132557 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:30 crc kubenswrapper[4957]: I0123 10:52:30.132573 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:30 crc kubenswrapper[4957]: I0123 10:52:30.132585 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:30Z","lastTransitionTime":"2026-01-23T10:52:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:30 crc kubenswrapper[4957]: I0123 10:52:30.235049 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:30 crc kubenswrapper[4957]: I0123 10:52:30.235110 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:30 crc kubenswrapper[4957]: I0123 10:52:30.235120 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:30 crc kubenswrapper[4957]: I0123 10:52:30.235142 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:30 crc kubenswrapper[4957]: I0123 10:52:30.235153 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:30Z","lastTransitionTime":"2026-01-23T10:52:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:30 crc kubenswrapper[4957]: I0123 10:52:30.338139 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:30 crc kubenswrapper[4957]: I0123 10:52:30.338184 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:30 crc kubenswrapper[4957]: I0123 10:52:30.338195 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:30 crc kubenswrapper[4957]: I0123 10:52:30.338211 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:30 crc kubenswrapper[4957]: I0123 10:52:30.338223 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:30Z","lastTransitionTime":"2026-01-23T10:52:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:30 crc kubenswrapper[4957]: I0123 10:52:30.441098 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:30 crc kubenswrapper[4957]: I0123 10:52:30.441406 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:30 crc kubenswrapper[4957]: I0123 10:52:30.441537 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:30 crc kubenswrapper[4957]: I0123 10:52:30.441674 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:30 crc kubenswrapper[4957]: I0123 10:52:30.441784 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:30Z","lastTransitionTime":"2026-01-23T10:52:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:30 crc kubenswrapper[4957]: I0123 10:52:30.545671 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:30 crc kubenswrapper[4957]: I0123 10:52:30.546154 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:30 crc kubenswrapper[4957]: I0123 10:52:30.546231 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:30 crc kubenswrapper[4957]: I0123 10:52:30.546345 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:30 crc kubenswrapper[4957]: I0123 10:52:30.546446 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:30Z","lastTransitionTime":"2026-01-23T10:52:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:30 crc kubenswrapper[4957]: I0123 10:52:30.649743 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:30 crc kubenswrapper[4957]: I0123 10:52:30.650201 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:30 crc kubenswrapper[4957]: I0123 10:52:30.650717 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:30 crc kubenswrapper[4957]: I0123 10:52:30.650825 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:30 crc kubenswrapper[4957]: I0123 10:52:30.650901 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:30Z","lastTransitionTime":"2026-01-23T10:52:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:30 crc kubenswrapper[4957]: I0123 10:52:30.749774 4957 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-11 06:17:23.393674904 +0000 UTC Jan 23 10:52:30 crc kubenswrapper[4957]: I0123 10:52:30.753912 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:30 crc kubenswrapper[4957]: I0123 10:52:30.754124 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:30 crc kubenswrapper[4957]: I0123 10:52:30.754214 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:30 crc kubenswrapper[4957]: I0123 10:52:30.754323 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:30 crc kubenswrapper[4957]: I0123 10:52:30.754423 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:30Z","lastTransitionTime":"2026-01-23T10:52:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:30 crc kubenswrapper[4957]: I0123 10:52:30.769669 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 23 10:52:30 crc kubenswrapper[4957]: E0123 10:52:30.769779 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 23 10:52:30 crc kubenswrapper[4957]: I0123 10:52:30.769947 4957 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-5fxfb" Jan 23 10:52:30 crc kubenswrapper[4957]: E0123 10:52:30.770002 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5fxfb" podUID="87775b38-0664-48f6-8857-7568c135bd79" Jan 23 10:52:30 crc kubenswrapper[4957]: I0123 10:52:30.769951 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 23 10:52:30 crc kubenswrapper[4957]: E0123 10:52:30.770262 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 23 10:52:30 crc kubenswrapper[4957]: I0123 10:52:30.770589 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 23 10:52:30 crc kubenswrapper[4957]: E0123 10:52:30.770752 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 23 10:52:30 crc kubenswrapper[4957]: I0123 10:52:30.781740 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5fxfb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87775b38-0664-48f6-8857-7568c135bd79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrj7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrj7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:52:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5fxfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:30Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:30 crc kubenswrapper[4957]: I0123 10:52:30.797078 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea507738-b425-4366-808b-3a47317e66d0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cdf71b1a8491d3a4853fde19a5b1af1eb4697cbf07de482e22a52704ba0470f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52eea4c3c7c3b8898e64dd0eb05c1883ea1c2fa94e7e606f3ab48bbf5aaee8d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f405b6b517d30a201b793965bd82536f496d62b89562cefc7e3a9d9f7829633\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b6915f908509c8609290327ffc2dccf0e5680dc227979285a7ebaca4643cb7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-
cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e837e02e63dbe59e7920302c0fb0b5c9165e96ebb684adadb02bacd61633214\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"le observer\\\\nW0123 10:51:48.273886 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0123 10:51:48.273997 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0123 10:51:48.275269 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3502356273/tls.crt::/tmp/serving-cert-3502356273/tls.key\\\\\\\"\\\\nI0123 10:51:48.537137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0123 10:51:48.548481 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0123 10:51:48.548515 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0123 10:51:48.548544 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0123 10:51:48.548577 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0123 10:51:48.561057 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0123 10:51:48.561112 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0123 10:51:48.561123 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0123 10:51:48.561139 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0123 10:51:48.561151 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0123 10:51:48.561158 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0123 10:51:48.561167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0123 10:51:48.561209 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0123 10:51:48.561756 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da77583099215643577c5d064d67ce2cca9d0b74e7ba7c88f3a948a8516fd66c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd61e9841e88c0a34764491cfafe3e4f94233e4ac0543a3e6ed1326dd8d7280d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd61e9841e88c0a34764491cfafe3e4f94233e4ac0543a3e6ed1326dd8d7280d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:30Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:30 crc kubenswrapper[4957]: I0123 10:52:30.809351 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"49d5bf3b-7b38-431d-abdd-266da7d33d54\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dea2d00e0b56b1da716e6188c5d0c1cbb52bcdf2a9483168aabc8f3bc408b7b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e85d135cfb611a98270feed36b0cb6f5992ca1432d5d1af0a62465e71ddd0244\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95ec9dd4bf806f7dd14d9e8b14fb6ccd83a8f5b1226a4cb365a946f4c6f8adad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://646a6c52a5b43295fea795b62f3903a351b07e95dc45af842bbe3c3218e143ff\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://646a6c52a5b43295fea795b62f3903a351b07e95dc45af842bbe3c3218e143ff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:30Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:30 crc kubenswrapper[4957]: I0123 10:52:30.821898 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:30Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:30 crc kubenswrapper[4957]: I0123 10:52:30.836465 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9f19c40d295c11e3a1170d61fd738b1dcd8fb10087f6a1bb74e6e6c8e6cfb56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6fcda9eaf99f7b60db85da6f18a98ccca7b5bc532aa28388fc7845caf1a7356\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:30Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:30 crc kubenswrapper[4957]: I0123 10:52:30.851476 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:30Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:30 crc kubenswrapper[4957]: I0123 10:52:30.858637 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:30 crc kubenswrapper[4957]: I0123 10:52:30.858679 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:30 crc kubenswrapper[4957]: I0123 10:52:30.858688 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:30 crc kubenswrapper[4957]: I0123 10:52:30.858704 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:30 crc kubenswrapper[4957]: I0123 10:52:30.858714 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:30Z","lastTransitionTime":"2026-01-23T10:52:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:30 crc kubenswrapper[4957]: I0123 10:52:30.871057 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6cq2v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11d94cd0-1619-4ef6-952a-aef84e1cdc75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdb19fbdef461009ebd78d9089ba9c94908e4c9fbcab108320e0d89c7f30547f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d28aff8ae7d4adbef5753dc21af0f1055f19ed4404bf9af31f34be215120555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d28aff8ae7d4adbef5753dc21af0f1055f19ed4404bf9af31f34be215120555\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90e2094361e033ef7c8226913e9e396e6a2036617edb9c5ad0ed00c2c5f9f707\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90e2094361e033ef7c8226913e9e396e6a2036617edb9c5ad0ed00c2c5f9f707\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e945a247fdc5833508edb04c5130a406d137e3b0ab1be12f9bf954f2621c3bf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e945a247fdc5833508edb04c5130a406d137e3b0ab1be12f9bf954f2621c3bf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26ffeff282bb20d68e9f648999f28b818a67b74f062714fd26ad22e70591a74c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26ffeff282bb20d68e9f648999f28b818a67b74f062714fd26ad22e70591a74c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bbe74c2c27b97766cfa5d26987cd0fbd8289871e66eab78c54a52c0aa039728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bbe74c2c27b97766cfa5d26987cd0fbd8289871e66eab78c54a52c0aa039728\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9ee5b55e77324735662dd6bc0fdeee86af454eb4b9e8eb9e877119f7c1395ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9ee5b55e77324735662dd6bc0fdeee86af454eb4b9e8eb9e877119f7c1395ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6cq2v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:30Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:30 crc kubenswrapper[4957]: I0123 10:52:30.886580 4957 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://512cd439903792d034cd6017d149d8f3e9e24ffbfc36964572fc9419d54c3513\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:30Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:30 crc kubenswrapper[4957]: I0123 10:52:30.900125 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:30Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:30 crc kubenswrapper[4957]: I0123 10:52:30.914037 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb53662e-fe72-4c19-b3a6-f5b541e5afcb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://829cfdb541d2a7861316957f39b8b9f43ec6f9f4e309a491f4451b1f3c34a9a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d9f270c80ebedc7d79510e2f421e23789483dce954f5e1469469703660febf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba
8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90926087d1bb350c991fa9425706fcc22e12eec003aba87b72758892aae9d3a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8144556693b41dc2f9121be49ceed161caf8db5eec797f086128a2016be8072\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:30Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:30 crc kubenswrapper[4957]: I0123 10:52:30.924812 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rg9hb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a6ddd9-627a-4faa-a4c4-096ea19af31d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f469de9a3c43ade33ae855757f1244dcd825827dea9633af7143c078b08d6d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wngq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rg9hb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:30Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:30 crc kubenswrapper[4957]: I0123 10:52:30.936671 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2xjv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"224e3211-1f68-4673-8975-7e71b1e513d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dd046581d049e9ca0071a010da143a9b28d271b533b9cdc1c94d19311be0320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s26mf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f355e8990ff693448c7b8df392b7b2caeb59d6fee6cf8d5d4200f8ce1b5e03ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s26mf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w2xjv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:30Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:30 crc kubenswrapper[4957]: I0123 10:52:30.953699 4957 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-tlz2g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"233fdd78-4010-4fe8-9068-ee47d8ff25d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6851e0ec1040550b8c9edb1b85213d2c849e381fae6b0f09c9a7247bd9c5088\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpwrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-tlz2g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:30Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:30 crc kubenswrapper[4957]: I0123 10:52:30.961340 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:30 crc kubenswrapper[4957]: I0123 10:52:30.961387 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:30 crc kubenswrapper[4957]: I0123 10:52:30.961399 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:30 crc kubenswrapper[4957]: I0123 10:52:30.961417 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:30 crc kubenswrapper[4957]: I0123 10:52:30.961430 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:30Z","lastTransitionTime":"2026-01-23T10:52:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:30 crc kubenswrapper[4957]: I0123 10:52:30.973163 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87adc28a-89e3-4743-a9f2-098d4a9432d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a14cf2687aa7c7a4c43dffbc2ad99a41aef0e46719171f63c7f769ee2d54e4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca18567eec1b0cc34d911b28d9f3d670a061722086817f58236f6a0da557262\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f862c6f11fe904458a8ecde92079c0b4aa4a9cb4dfc6f2ca094a1d3142570d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0efa75cf10a812bc4de5b071048558eb5f48828f6fb3049f3820fe5e0b7e2b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1a90ac89ce8ac710e5f8cff26e69aff44a735ef8155a7e93324809904a33e01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26b4bdc4f2514902dc2c95df59af4c954a5c1905821f5981e9437ff54d6d544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b61126eca35a8279d32f7b9386c382f26da91f6
b28007d881463f438155d2e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b61126eca35a8279d32f7b9386c382f26da91f6b28007d881463f438155d2e3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-23T10:52:21Z\\\",\\\"message\\\":\\\"e configs for network=default: []services.lbConfig(nil)\\\\nI0123 10:52:21.654230 6620 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-service-ca-operator/metrics]} name:Service_openshift-service-ca-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.40:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {2a3fb1a3-a476-4e14-bcf5-fb79af60206a}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0123 10:52:21.654121 6620 ovnkube_controller.go:900] Cache entry expected pod with UID \\\\\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\\\\\" but failed to find it\\\\nF0123 10:52:21.652450 6620 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controlle\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-23T10:52:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-z8hcw_openshift-ovn-kubernetes(87adc28a-89e3-4743-a9f2-098d4a9432d8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1be5b459e3fae28da165ef0ee506ec5ccd39026d7b7e7c35a3f242c65d60d0ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3de3a4f166173fb3f0ff7428b1d1ddf43a9988749fc445ecff4a7527bd1f3a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3de3a4f166173fb3f0ff7428b1d1ddf43a9988749fc445ecff4a7527bd1f3a4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z8hcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:30Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:30 crc kubenswrapper[4957]: I0123 10:52:30.986865 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9rkq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"340bb9e5-0a20-4377-acf7-aba4b7788153\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac26363f220465a8681578f7a7b90cf7d0abf8676379a1e963f4998646327c95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:52:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55l78
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19e8778b10a2bb2a713374cb69e07a76edf2371f7a130a691e759a03c0322251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:52:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55l78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:52:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r9rkq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:30Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:31 crc kubenswrapper[4957]: I0123 10:52:31.015345 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8631604-63ce-40b0-b27e-fba17f940f20\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://523b9a208f414955faffe254957d3bb6d287eab26ea653e23c9bcc2c3182d5cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80645d17b02b24a907b20d376fcb65a794768d4c9cf07550bff63d50a011836d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3fcfc9fcf5f37f32b4a654710f3f0f5c3fab5b0b5c35239e5f1a2789d1ec480\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b48e3593322a778bf8d56e3509d97a341f1fee5
e172f8ba4bbc4c1dacefb3930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://193091ca5d5fb974b1e2da289e7fbc6e2d3a292e79d1936c7ba10266a5ba9779\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7feaf0932d8842db7af2f28984a811c16674b50ba28a4627829ce8471543daa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7feaf0932d8842db7af2f28984a811c16674b50ba28a4627829ce8471543daa9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://370f9a1676bfe308ad9883454af09a802df678db77e0e246de3cc7aea91410a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://370f9a1676bfe308ad9883454af09a802df678db77e0e246de3cc7aea91410a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://78b38484db8444b196494249f32bd63f2478d80e89ff823c8ac7c974b1ea6e76\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78b38484db8444b196494249f32bd63f2478d80e89ff823c8ac7c974b1ea6e76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:31Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:31 crc kubenswrapper[4957]: I0123 10:52:31.030883 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b07c6571fe0e39bd6607feb900919a481ef8a36483c9b4de1c6d5ea3453ba61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:31Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:31 crc kubenswrapper[4957]: I0123 10:52:31.045137 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fnxz6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c7b1449-2e9b-4c07-a531-591cb968f511\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6410b47b5b38b4ce50175e8cd9c2cc7ca241b914d1dba4accf3a1deb3e066ccd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj687\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:49Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fnxz6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:31Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:31 crc kubenswrapper[4957]: I0123 10:52:31.063974 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:31 crc kubenswrapper[4957]: I0123 10:52:31.064028 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:31 crc kubenswrapper[4957]: I0123 10:52:31.064045 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:31 crc kubenswrapper[4957]: I0123 10:52:31.064067 4957 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeNotReady" Jan 23 10:52:31 crc kubenswrapper[4957]: I0123 10:52:31.064083 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:31Z","lastTransitionTime":"2026-01-23T10:52:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:31 crc kubenswrapper[4957]: I0123 10:52:31.166617 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:31 crc kubenswrapper[4957]: I0123 10:52:31.166677 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:31 crc kubenswrapper[4957]: I0123 10:52:31.166697 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:31 crc kubenswrapper[4957]: I0123 10:52:31.166723 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:31 crc kubenswrapper[4957]: I0123 10:52:31.166745 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:31Z","lastTransitionTime":"2026-01-23T10:52:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:31 crc kubenswrapper[4957]: I0123 10:52:31.269836 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:31 crc kubenswrapper[4957]: I0123 10:52:31.269896 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:31 crc kubenswrapper[4957]: I0123 10:52:31.269921 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:31 crc kubenswrapper[4957]: I0123 10:52:31.269950 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:31 crc kubenswrapper[4957]: I0123 10:52:31.269973 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:31Z","lastTransitionTime":"2026-01-23T10:52:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:31 crc kubenswrapper[4957]: I0123 10:52:31.372901 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:31 crc kubenswrapper[4957]: I0123 10:52:31.372969 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:31 crc kubenswrapper[4957]: I0123 10:52:31.372991 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:31 crc kubenswrapper[4957]: I0123 10:52:31.373021 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:31 crc kubenswrapper[4957]: I0123 10:52:31.373042 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:31Z","lastTransitionTime":"2026-01-23T10:52:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:31 crc kubenswrapper[4957]: I0123 10:52:31.476051 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:31 crc kubenswrapper[4957]: I0123 10:52:31.476120 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:31 crc kubenswrapper[4957]: I0123 10:52:31.476141 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:31 crc kubenswrapper[4957]: I0123 10:52:31.476171 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:31 crc kubenswrapper[4957]: I0123 10:52:31.476192 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:31Z","lastTransitionTime":"2026-01-23T10:52:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:31 crc kubenswrapper[4957]: I0123 10:52:31.580034 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:31 crc kubenswrapper[4957]: I0123 10:52:31.580098 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:31 crc kubenswrapper[4957]: I0123 10:52:31.580124 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:31 crc kubenswrapper[4957]: I0123 10:52:31.580154 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:31 crc kubenswrapper[4957]: I0123 10:52:31.580180 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:31Z","lastTransitionTime":"2026-01-23T10:52:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:31 crc kubenswrapper[4957]: I0123 10:52:31.682900 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:31 crc kubenswrapper[4957]: I0123 10:52:31.682930 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:31 crc kubenswrapper[4957]: I0123 10:52:31.682937 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:31 crc kubenswrapper[4957]: I0123 10:52:31.682951 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:31 crc kubenswrapper[4957]: I0123 10:52:31.682960 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:31Z","lastTransitionTime":"2026-01-23T10:52:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:31 crc kubenswrapper[4957]: I0123 10:52:31.750638 4957 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-17 03:49:15.527205077 +0000 UTC Jan 23 10:52:31 crc kubenswrapper[4957]: I0123 10:52:31.784808 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:31 crc kubenswrapper[4957]: I0123 10:52:31.784842 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:31 crc kubenswrapper[4957]: I0123 10:52:31.784875 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:31 crc kubenswrapper[4957]: I0123 10:52:31.784887 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:31 crc kubenswrapper[4957]: I0123 10:52:31.784896 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:31Z","lastTransitionTime":"2026-01-23T10:52:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:31 crc kubenswrapper[4957]: I0123 10:52:31.887593 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:31 crc kubenswrapper[4957]: I0123 10:52:31.887656 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:31 crc kubenswrapper[4957]: I0123 10:52:31.887680 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:31 crc kubenswrapper[4957]: I0123 10:52:31.887709 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:31 crc kubenswrapper[4957]: I0123 10:52:31.887730 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:31Z","lastTransitionTime":"2026-01-23T10:52:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:31 crc kubenswrapper[4957]: I0123 10:52:31.990149 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:31 crc kubenswrapper[4957]: I0123 10:52:31.990235 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:31 crc kubenswrapper[4957]: I0123 10:52:31.990255 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:31 crc kubenswrapper[4957]: I0123 10:52:31.990325 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:31 crc kubenswrapper[4957]: I0123 10:52:31.990354 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:31Z","lastTransitionTime":"2026-01-23T10:52:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:32 crc kubenswrapper[4957]: I0123 10:52:32.093710 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:32 crc kubenswrapper[4957]: I0123 10:52:32.093789 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:32 crc kubenswrapper[4957]: I0123 10:52:32.093815 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:32 crc kubenswrapper[4957]: I0123 10:52:32.093848 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:32 crc kubenswrapper[4957]: I0123 10:52:32.093872 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:32Z","lastTransitionTime":"2026-01-23T10:52:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:32 crc kubenswrapper[4957]: I0123 10:52:32.196596 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:32 crc kubenswrapper[4957]: I0123 10:52:32.196663 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:32 crc kubenswrapper[4957]: I0123 10:52:32.196686 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:32 crc kubenswrapper[4957]: I0123 10:52:32.196715 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:32 crc kubenswrapper[4957]: I0123 10:52:32.196736 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:32Z","lastTransitionTime":"2026-01-23T10:52:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:32 crc kubenswrapper[4957]: I0123 10:52:32.299425 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:32 crc kubenswrapper[4957]: I0123 10:52:32.299907 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:32 crc kubenswrapper[4957]: I0123 10:52:32.300285 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:32 crc kubenswrapper[4957]: I0123 10:52:32.300563 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:32 crc kubenswrapper[4957]: I0123 10:52:32.300758 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:32Z","lastTransitionTime":"2026-01-23T10:52:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:32 crc kubenswrapper[4957]: I0123 10:52:32.403531 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:32 crc kubenswrapper[4957]: I0123 10:52:32.403595 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:32 crc kubenswrapper[4957]: I0123 10:52:32.403614 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:32 crc kubenswrapper[4957]: I0123 10:52:32.403639 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:32 crc kubenswrapper[4957]: I0123 10:52:32.403658 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:32Z","lastTransitionTime":"2026-01-23T10:52:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:32 crc kubenswrapper[4957]: I0123 10:52:32.506586 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:32 crc kubenswrapper[4957]: I0123 10:52:32.506613 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:32 crc kubenswrapper[4957]: I0123 10:52:32.506625 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:32 crc kubenswrapper[4957]: I0123 10:52:32.506640 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:32 crc kubenswrapper[4957]: I0123 10:52:32.506653 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:32Z","lastTransitionTime":"2026-01-23T10:52:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:32 crc kubenswrapper[4957]: I0123 10:52:32.608803 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:32 crc kubenswrapper[4957]: I0123 10:52:32.608842 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:32 crc kubenswrapper[4957]: I0123 10:52:32.608854 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:32 crc kubenswrapper[4957]: I0123 10:52:32.608871 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:32 crc kubenswrapper[4957]: I0123 10:52:32.608883 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:32Z","lastTransitionTime":"2026-01-23T10:52:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:32 crc kubenswrapper[4957]: I0123 10:52:32.711674 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:32 crc kubenswrapper[4957]: I0123 10:52:32.711720 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:32 crc kubenswrapper[4957]: I0123 10:52:32.711736 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:32 crc kubenswrapper[4957]: I0123 10:52:32.711761 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:32 crc kubenswrapper[4957]: I0123 10:52:32.711778 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:32Z","lastTransitionTime":"2026-01-23T10:52:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:32 crc kubenswrapper[4957]: I0123 10:52:32.751431 4957 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-03 07:04:49.658116881 +0000 UTC Jan 23 10:52:32 crc kubenswrapper[4957]: I0123 10:52:32.769953 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 23 10:52:32 crc kubenswrapper[4957]: I0123 10:52:32.770033 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 23 10:52:32 crc kubenswrapper[4957]: E0123 10:52:32.770126 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 23 10:52:32 crc kubenswrapper[4957]: I0123 10:52:32.770182 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5fxfb" Jan 23 10:52:32 crc kubenswrapper[4957]: E0123 10:52:32.770226 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 23 10:52:32 crc kubenswrapper[4957]: E0123 10:52:32.770412 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5fxfb" podUID="87775b38-0664-48f6-8857-7568c135bd79" Jan 23 10:52:32 crc kubenswrapper[4957]: I0123 10:52:32.770724 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 23 10:52:32 crc kubenswrapper[4957]: E0123 10:52:32.771050 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 23 10:52:32 crc kubenswrapper[4957]: I0123 10:52:32.813966 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:32 crc kubenswrapper[4957]: I0123 10:52:32.814002 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:32 crc kubenswrapper[4957]: I0123 10:52:32.814013 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:32 crc kubenswrapper[4957]: I0123 10:52:32.814029 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:32 crc kubenswrapper[4957]: I0123 10:52:32.814043 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:32Z","lastTransitionTime":"2026-01-23T10:52:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:32 crc kubenswrapper[4957]: I0123 10:52:32.916404 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:32 crc kubenswrapper[4957]: I0123 10:52:32.916563 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:32 crc kubenswrapper[4957]: I0123 10:52:32.916586 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:32 crc kubenswrapper[4957]: I0123 10:52:32.916613 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:32 crc kubenswrapper[4957]: I0123 10:52:32.916635 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:32Z","lastTransitionTime":"2026-01-23T10:52:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:33 crc kubenswrapper[4957]: I0123 10:52:33.019689 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:33 crc kubenswrapper[4957]: I0123 10:52:33.020339 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:33 crc kubenswrapper[4957]: I0123 10:52:33.020452 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:33 crc kubenswrapper[4957]: I0123 10:52:33.020567 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:33 crc kubenswrapper[4957]: I0123 10:52:33.020667 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:33Z","lastTransitionTime":"2026-01-23T10:52:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:33 crc kubenswrapper[4957]: I0123 10:52:33.123940 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:33 crc kubenswrapper[4957]: I0123 10:52:33.124008 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:33 crc kubenswrapper[4957]: I0123 10:52:33.124033 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:33 crc kubenswrapper[4957]: I0123 10:52:33.124063 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:33 crc kubenswrapper[4957]: I0123 10:52:33.124085 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:33Z","lastTransitionTime":"2026-01-23T10:52:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:33 crc kubenswrapper[4957]: I0123 10:52:33.135129 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:33 crc kubenswrapper[4957]: I0123 10:52:33.135247 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:33 crc kubenswrapper[4957]: I0123 10:52:33.135274 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:33 crc kubenswrapper[4957]: I0123 10:52:33.135335 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:33 crc kubenswrapper[4957]: I0123 10:52:33.135358 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:33Z","lastTransitionTime":"2026-01-23T10:52:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:33 crc kubenswrapper[4957]: E0123 10:52:33.157348 4957 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"608e000a-3057-4f1e-b4ab-15bf3bfea3b8\\\",\\\"systemUUID\\\":\\\"4219e85c-09d5-42d3-a5cb-7c9fe3da136f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:33Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:33 crc kubenswrapper[4957]: I0123 10:52:33.162969 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:33 crc kubenswrapper[4957]: I0123 10:52:33.163124 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 23 10:52:33 crc kubenswrapper[4957]: I0123 10:52:33.163151 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:33 crc kubenswrapper[4957]: I0123 10:52:33.163177 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:33 crc kubenswrapper[4957]: I0123 10:52:33.163199 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:33Z","lastTransitionTime":"2026-01-23T10:52:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:33 crc kubenswrapper[4957]: E0123 10:52:33.183081 4957 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"608e000a-3057-4f1e-b4ab-15bf3bfea3b8\\\",\\\"systemUUID\\\":\\\"4219e85c-09d5-42d3-a5cb-7c9fe3da136f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:33Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:33 crc kubenswrapper[4957]: I0123 10:52:33.192961 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:33 crc kubenswrapper[4957]: I0123 10:52:33.193058 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 23 10:52:33 crc kubenswrapper[4957]: I0123 10:52:33.193084 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:33 crc kubenswrapper[4957]: I0123 10:52:33.193117 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:33 crc kubenswrapper[4957]: I0123 10:52:33.193142 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:33Z","lastTransitionTime":"2026-01-23T10:52:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:33 crc kubenswrapper[4957]: E0123 10:52:33.215627 4957 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"608e000a-3057-4f1e-b4ab-15bf3bfea3b8\\\",\\\"systemUUID\\\":\\\"4219e85c-09d5-42d3-a5cb-7c9fe3da136f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:33Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:33 crc kubenswrapper[4957]: I0123 10:52:33.220623 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:33 crc kubenswrapper[4957]: I0123 10:52:33.220675 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 23 10:52:33 crc kubenswrapper[4957]: I0123 10:52:33.220699 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:33 crc kubenswrapper[4957]: I0123 10:52:33.220727 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:33 crc kubenswrapper[4957]: I0123 10:52:33.220793 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:33Z","lastTransitionTime":"2026-01-23T10:52:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:33 crc kubenswrapper[4957]: E0123 10:52:33.240074 4957 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"608e000a-3057-4f1e-b4ab-15bf3bfea3b8\\\",\\\"systemUUID\\\":\\\"4219e85c-09d5-42d3-a5cb-7c9fe3da136f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:33Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:33 crc kubenswrapper[4957]: I0123 10:52:33.245090 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:33 crc kubenswrapper[4957]: I0123 10:52:33.245316 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 23 10:52:33 crc kubenswrapper[4957]: I0123 10:52:33.245345 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:33 crc kubenswrapper[4957]: I0123 10:52:33.245374 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:33 crc kubenswrapper[4957]: I0123 10:52:33.245395 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:33Z","lastTransitionTime":"2026-01-23T10:52:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:33 crc kubenswrapper[4957]: E0123 10:52:33.263006 4957 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"608e000a-3057-4f1e-b4ab-15bf3bfea3b8\\\",\\\"systemUUID\\\":\\\"4219e85c-09d5-42d3-a5cb-7c9fe3da136f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:33Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:33 crc kubenswrapper[4957]: E0123 10:52:33.263233 4957 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 23 10:52:33 crc kubenswrapper[4957]: I0123 10:52:33.265739 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Jan 23 10:52:33 crc kubenswrapper[4957]: I0123 10:52:33.265802 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:33 crc kubenswrapper[4957]: I0123 10:52:33.265820 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:33 crc kubenswrapper[4957]: I0123 10:52:33.265845 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:33 crc kubenswrapper[4957]: I0123 10:52:33.265864 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:33Z","lastTransitionTime":"2026-01-23T10:52:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:33 crc kubenswrapper[4957]: I0123 10:52:33.369168 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:33 crc kubenswrapper[4957]: I0123 10:52:33.369226 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:33 crc kubenswrapper[4957]: I0123 10:52:33.369243 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:33 crc kubenswrapper[4957]: I0123 10:52:33.369316 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:33 crc kubenswrapper[4957]: I0123 10:52:33.369353 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:33Z","lastTransitionTime":"2026-01-23T10:52:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:33 crc kubenswrapper[4957]: I0123 10:52:33.471842 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:33 crc kubenswrapper[4957]: I0123 10:52:33.471905 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:33 crc kubenswrapper[4957]: I0123 10:52:33.471922 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:33 crc kubenswrapper[4957]: I0123 10:52:33.471944 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:33 crc kubenswrapper[4957]: I0123 10:52:33.471964 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:33Z","lastTransitionTime":"2026-01-23T10:52:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:33 crc kubenswrapper[4957]: I0123 10:52:33.574568 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:33 crc kubenswrapper[4957]: I0123 10:52:33.574606 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:33 crc kubenswrapper[4957]: I0123 10:52:33.574616 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:33 crc kubenswrapper[4957]: I0123 10:52:33.574631 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:33 crc kubenswrapper[4957]: I0123 10:52:33.574641 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:33Z","lastTransitionTime":"2026-01-23T10:52:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:33 crc kubenswrapper[4957]: I0123 10:52:33.677617 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:33 crc kubenswrapper[4957]: I0123 10:52:33.677688 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:33 crc kubenswrapper[4957]: I0123 10:52:33.677716 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:33 crc kubenswrapper[4957]: I0123 10:52:33.677745 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:33 crc kubenswrapper[4957]: I0123 10:52:33.677770 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:33Z","lastTransitionTime":"2026-01-23T10:52:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:33 crc kubenswrapper[4957]: I0123 10:52:33.752099 4957 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-22 00:20:58.823234959 +0000 UTC Jan 23 10:52:33 crc kubenswrapper[4957]: I0123 10:52:33.780792 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:33 crc kubenswrapper[4957]: I0123 10:52:33.780848 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:33 crc kubenswrapper[4957]: I0123 10:52:33.780865 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:33 crc kubenswrapper[4957]: I0123 10:52:33.780889 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:33 crc kubenswrapper[4957]: I0123 10:52:33.780907 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:33Z","lastTransitionTime":"2026-01-23T10:52:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:33 crc kubenswrapper[4957]: I0123 10:52:33.883882 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:33 crc kubenswrapper[4957]: I0123 10:52:33.883920 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:33 crc kubenswrapper[4957]: I0123 10:52:33.883930 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:33 crc kubenswrapper[4957]: I0123 10:52:33.883945 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:33 crc kubenswrapper[4957]: I0123 10:52:33.883958 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:33Z","lastTransitionTime":"2026-01-23T10:52:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:33 crc kubenswrapper[4957]: I0123 10:52:33.987632 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:33 crc kubenswrapper[4957]: I0123 10:52:33.987889 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:33 crc kubenswrapper[4957]: I0123 10:52:33.987908 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:33 crc kubenswrapper[4957]: I0123 10:52:33.987933 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:33 crc kubenswrapper[4957]: I0123 10:52:33.987950 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:33Z","lastTransitionTime":"2026-01-23T10:52:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:34 crc kubenswrapper[4957]: I0123 10:52:34.090739 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:34 crc kubenswrapper[4957]: I0123 10:52:34.090811 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:34 crc kubenswrapper[4957]: I0123 10:52:34.090830 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:34 crc kubenswrapper[4957]: I0123 10:52:34.090853 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:34 crc kubenswrapper[4957]: I0123 10:52:34.090888 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:34Z","lastTransitionTime":"2026-01-23T10:52:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:34 crc kubenswrapper[4957]: I0123 10:52:34.193880 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:34 crc kubenswrapper[4957]: I0123 10:52:34.193932 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:34 crc kubenswrapper[4957]: I0123 10:52:34.193952 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:34 crc kubenswrapper[4957]: I0123 10:52:34.193975 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:34 crc kubenswrapper[4957]: I0123 10:52:34.193993 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:34Z","lastTransitionTime":"2026-01-23T10:52:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:34 crc kubenswrapper[4957]: I0123 10:52:34.296490 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:34 crc kubenswrapper[4957]: I0123 10:52:34.296538 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:34 crc kubenswrapper[4957]: I0123 10:52:34.296554 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:34 crc kubenswrapper[4957]: I0123 10:52:34.296576 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:34 crc kubenswrapper[4957]: I0123 10:52:34.296593 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:34Z","lastTransitionTime":"2026-01-23T10:52:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:34 crc kubenswrapper[4957]: I0123 10:52:34.398262 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:34 crc kubenswrapper[4957]: I0123 10:52:34.398324 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:34 crc kubenswrapper[4957]: I0123 10:52:34.398334 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:34 crc kubenswrapper[4957]: I0123 10:52:34.398351 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:34 crc kubenswrapper[4957]: I0123 10:52:34.398361 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:34Z","lastTransitionTime":"2026-01-23T10:52:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:34 crc kubenswrapper[4957]: I0123 10:52:34.500753 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:34 crc kubenswrapper[4957]: I0123 10:52:34.500799 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:34 crc kubenswrapper[4957]: I0123 10:52:34.500810 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:34 crc kubenswrapper[4957]: I0123 10:52:34.500826 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:34 crc kubenswrapper[4957]: I0123 10:52:34.500838 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:34Z","lastTransitionTime":"2026-01-23T10:52:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:34 crc kubenswrapper[4957]: I0123 10:52:34.603195 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:34 crc kubenswrapper[4957]: I0123 10:52:34.603235 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:34 crc kubenswrapper[4957]: I0123 10:52:34.603244 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:34 crc kubenswrapper[4957]: I0123 10:52:34.603258 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:34 crc kubenswrapper[4957]: I0123 10:52:34.603267 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:34Z","lastTransitionTime":"2026-01-23T10:52:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:34 crc kubenswrapper[4957]: I0123 10:52:34.704861 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:34 crc kubenswrapper[4957]: I0123 10:52:34.704896 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:34 crc kubenswrapper[4957]: I0123 10:52:34.704905 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:34 crc kubenswrapper[4957]: I0123 10:52:34.704919 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:34 crc kubenswrapper[4957]: I0123 10:52:34.704929 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:34Z","lastTransitionTime":"2026-01-23T10:52:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:34 crc kubenswrapper[4957]: I0123 10:52:34.752901 4957 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-07 15:41:47.888639711 +0000 UTC Jan 23 10:52:34 crc kubenswrapper[4957]: I0123 10:52:34.769756 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 23 10:52:34 crc kubenswrapper[4957]: I0123 10:52:34.769795 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5fxfb" Jan 23 10:52:34 crc kubenswrapper[4957]: I0123 10:52:34.769778 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 23 10:52:34 crc kubenswrapper[4957]: I0123 10:52:34.769765 4957 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 23 10:52:34 crc kubenswrapper[4957]: E0123 10:52:34.769917 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 23 10:52:34 crc kubenswrapper[4957]: E0123 10:52:34.770047 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5fxfb" podUID="87775b38-0664-48f6-8857-7568c135bd79" Jan 23 10:52:34 crc kubenswrapper[4957]: E0123 10:52:34.770220 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 23 10:52:34 crc kubenswrapper[4957]: E0123 10:52:34.770455 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 23 10:52:34 crc kubenswrapper[4957]: I0123 10:52:34.807442 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:34 crc kubenswrapper[4957]: I0123 10:52:34.807490 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:34 crc kubenswrapper[4957]: I0123 10:52:34.807503 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:34 crc kubenswrapper[4957]: I0123 10:52:34.807520 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:34 crc kubenswrapper[4957]: I0123 10:52:34.807532 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:34Z","lastTransitionTime":"2026-01-23T10:52:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:34 crc kubenswrapper[4957]: I0123 10:52:34.910311 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:34 crc kubenswrapper[4957]: I0123 10:52:34.910366 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:34 crc kubenswrapper[4957]: I0123 10:52:34.910386 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:34 crc kubenswrapper[4957]: I0123 10:52:34.910428 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:34 crc kubenswrapper[4957]: I0123 10:52:34.910447 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:34Z","lastTransitionTime":"2026-01-23T10:52:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:35 crc kubenswrapper[4957]: I0123 10:52:35.013081 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:35 crc kubenswrapper[4957]: I0123 10:52:35.013132 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:35 crc kubenswrapper[4957]: I0123 10:52:35.013144 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:35 crc kubenswrapper[4957]: I0123 10:52:35.013162 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:35 crc kubenswrapper[4957]: I0123 10:52:35.013174 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:35Z","lastTransitionTime":"2026-01-23T10:52:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:35 crc kubenswrapper[4957]: I0123 10:52:35.116469 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:35 crc kubenswrapper[4957]: I0123 10:52:35.116519 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:35 crc kubenswrapper[4957]: I0123 10:52:35.116537 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:35 crc kubenswrapper[4957]: I0123 10:52:35.116559 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:35 crc kubenswrapper[4957]: I0123 10:52:35.116575 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:35Z","lastTransitionTime":"2026-01-23T10:52:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:35 crc kubenswrapper[4957]: I0123 10:52:35.218856 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:35 crc kubenswrapper[4957]: I0123 10:52:35.218896 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:35 crc kubenswrapper[4957]: I0123 10:52:35.218904 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:35 crc kubenswrapper[4957]: I0123 10:52:35.218919 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:35 crc kubenswrapper[4957]: I0123 10:52:35.218930 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:35Z","lastTransitionTime":"2026-01-23T10:52:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:35 crc kubenswrapper[4957]: I0123 10:52:35.321136 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:35 crc kubenswrapper[4957]: I0123 10:52:35.321186 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:35 crc kubenswrapper[4957]: I0123 10:52:35.321200 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:35 crc kubenswrapper[4957]: I0123 10:52:35.321219 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:35 crc kubenswrapper[4957]: I0123 10:52:35.321230 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:35Z","lastTransitionTime":"2026-01-23T10:52:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:35 crc kubenswrapper[4957]: I0123 10:52:35.423948 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:35 crc kubenswrapper[4957]: I0123 10:52:35.424008 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:35 crc kubenswrapper[4957]: I0123 10:52:35.424023 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:35 crc kubenswrapper[4957]: I0123 10:52:35.424043 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:35 crc kubenswrapper[4957]: I0123 10:52:35.424058 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:35Z","lastTransitionTime":"2026-01-23T10:52:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:35 crc kubenswrapper[4957]: I0123 10:52:35.526602 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:35 crc kubenswrapper[4957]: I0123 10:52:35.526642 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:35 crc kubenswrapper[4957]: I0123 10:52:35.526654 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:35 crc kubenswrapper[4957]: I0123 10:52:35.526670 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:35 crc kubenswrapper[4957]: I0123 10:52:35.526682 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:35Z","lastTransitionTime":"2026-01-23T10:52:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:35 crc kubenswrapper[4957]: I0123 10:52:35.629122 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:35 crc kubenswrapper[4957]: I0123 10:52:35.629156 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:35 crc kubenswrapper[4957]: I0123 10:52:35.629167 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:35 crc kubenswrapper[4957]: I0123 10:52:35.629183 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:35 crc kubenswrapper[4957]: I0123 10:52:35.629193 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:35Z","lastTransitionTime":"2026-01-23T10:52:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:35 crc kubenswrapper[4957]: I0123 10:52:35.731778 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:35 crc kubenswrapper[4957]: I0123 10:52:35.731820 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:35 crc kubenswrapper[4957]: I0123 10:52:35.731831 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:35 crc kubenswrapper[4957]: I0123 10:52:35.731847 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:35 crc kubenswrapper[4957]: I0123 10:52:35.731858 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:35Z","lastTransitionTime":"2026-01-23T10:52:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:35 crc kubenswrapper[4957]: I0123 10:52:35.753193 4957 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-28 06:51:56.52503077 +0000 UTC Jan 23 10:52:35 crc kubenswrapper[4957]: I0123 10:52:35.834540 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:35 crc kubenswrapper[4957]: I0123 10:52:35.834574 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:35 crc kubenswrapper[4957]: I0123 10:52:35.834585 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:35 crc kubenswrapper[4957]: I0123 10:52:35.834602 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:35 crc kubenswrapper[4957]: I0123 10:52:35.834615 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:35Z","lastTransitionTime":"2026-01-23T10:52:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:35 crc kubenswrapper[4957]: I0123 10:52:35.937016 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:35 crc kubenswrapper[4957]: I0123 10:52:35.937065 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:35 crc kubenswrapper[4957]: I0123 10:52:35.937077 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:35 crc kubenswrapper[4957]: I0123 10:52:35.937099 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:35 crc kubenswrapper[4957]: I0123 10:52:35.937112 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:35Z","lastTransitionTime":"2026-01-23T10:52:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:36 crc kubenswrapper[4957]: I0123 10:52:36.038890 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:36 crc kubenswrapper[4957]: I0123 10:52:36.038921 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:36 crc kubenswrapper[4957]: I0123 10:52:36.038934 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:36 crc kubenswrapper[4957]: I0123 10:52:36.038955 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:36 crc kubenswrapper[4957]: I0123 10:52:36.038970 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:36Z","lastTransitionTime":"2026-01-23T10:52:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:36 crc kubenswrapper[4957]: I0123 10:52:36.141163 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:36 crc kubenswrapper[4957]: I0123 10:52:36.141193 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:36 crc kubenswrapper[4957]: I0123 10:52:36.141203 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:36 crc kubenswrapper[4957]: I0123 10:52:36.141255 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:36 crc kubenswrapper[4957]: I0123 10:52:36.141268 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:36Z","lastTransitionTime":"2026-01-23T10:52:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:36 crc kubenswrapper[4957]: I0123 10:52:36.243241 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:36 crc kubenswrapper[4957]: I0123 10:52:36.243263 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:36 crc kubenswrapper[4957]: I0123 10:52:36.243290 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:36 crc kubenswrapper[4957]: I0123 10:52:36.243304 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:36 crc kubenswrapper[4957]: I0123 10:52:36.243314 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:36Z","lastTransitionTime":"2026-01-23T10:52:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:36 crc kubenswrapper[4957]: I0123 10:52:36.345105 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:36 crc kubenswrapper[4957]: I0123 10:52:36.345139 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:36 crc kubenswrapper[4957]: I0123 10:52:36.345148 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:36 crc kubenswrapper[4957]: I0123 10:52:36.345165 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:36 crc kubenswrapper[4957]: I0123 10:52:36.345176 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:36Z","lastTransitionTime":"2026-01-23T10:52:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:36 crc kubenswrapper[4957]: I0123 10:52:36.416818 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/87775b38-0664-48f6-8857-7568c135bd79-metrics-certs\") pod \"network-metrics-daemon-5fxfb\" (UID: \"87775b38-0664-48f6-8857-7568c135bd79\") " pod="openshift-multus/network-metrics-daemon-5fxfb" Jan 23 10:52:36 crc kubenswrapper[4957]: E0123 10:52:36.416968 4957 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 23 10:52:36 crc kubenswrapper[4957]: E0123 10:52:36.417022 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/87775b38-0664-48f6-8857-7568c135bd79-metrics-certs podName:87775b38-0664-48f6-8857-7568c135bd79 nodeName:}" failed. No retries permitted until 2026-01-23 10:53:08.41700458 +0000 UTC m=+97.954257267 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/87775b38-0664-48f6-8857-7568c135bd79-metrics-certs") pod "network-metrics-daemon-5fxfb" (UID: "87775b38-0664-48f6-8857-7568c135bd79") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 23 10:52:36 crc kubenswrapper[4957]: I0123 10:52:36.447638 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:36 crc kubenswrapper[4957]: I0123 10:52:36.447729 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:36 crc kubenswrapper[4957]: I0123 10:52:36.447830 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:36 crc kubenswrapper[4957]: I0123 10:52:36.447856 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:36 crc kubenswrapper[4957]: I0123 10:52:36.447874 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:36Z","lastTransitionTime":"2026-01-23T10:52:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:36 crc kubenswrapper[4957]: I0123 10:52:36.550583 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:36 crc kubenswrapper[4957]: I0123 10:52:36.550624 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:36 crc kubenswrapper[4957]: I0123 10:52:36.550635 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:36 crc kubenswrapper[4957]: I0123 10:52:36.550653 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:36 crc kubenswrapper[4957]: I0123 10:52:36.550664 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:36Z","lastTransitionTime":"2026-01-23T10:52:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:36 crc kubenswrapper[4957]: I0123 10:52:36.653516 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:36 crc kubenswrapper[4957]: I0123 10:52:36.653568 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:36 crc kubenswrapper[4957]: I0123 10:52:36.653577 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:36 crc kubenswrapper[4957]: I0123 10:52:36.653593 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:36 crc kubenswrapper[4957]: I0123 10:52:36.653603 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:36Z","lastTransitionTime":"2026-01-23T10:52:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:36 crc kubenswrapper[4957]: I0123 10:52:36.754134 4957 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-13 22:55:08.85378318 +0000 UTC Jan 23 10:52:36 crc kubenswrapper[4957]: I0123 10:52:36.755842 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:36 crc kubenswrapper[4957]: I0123 10:52:36.755899 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:36 crc kubenswrapper[4957]: I0123 10:52:36.755912 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:36 crc kubenswrapper[4957]: I0123 10:52:36.755931 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:36 crc kubenswrapper[4957]: I0123 10:52:36.755943 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:36Z","lastTransitionTime":"2026-01-23T10:52:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:36 crc kubenswrapper[4957]: I0123 10:52:36.769156 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 23 10:52:36 crc kubenswrapper[4957]: I0123 10:52:36.769159 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5fxfb" Jan 23 10:52:36 crc kubenswrapper[4957]: I0123 10:52:36.769209 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 23 10:52:36 crc kubenswrapper[4957]: I0123 10:52:36.769215 4957 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 23 10:52:36 crc kubenswrapper[4957]: E0123 10:52:36.769331 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 23 10:52:36 crc kubenswrapper[4957]: E0123 10:52:36.769707 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 23 10:52:36 crc kubenswrapper[4957]: E0123 10:52:36.769873 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5fxfb" podUID="87775b38-0664-48f6-8857-7568c135bd79" Jan 23 10:52:36 crc kubenswrapper[4957]: E0123 10:52:36.769967 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 23 10:52:36 crc kubenswrapper[4957]: I0123 10:52:36.770001 4957 scope.go:117] "RemoveContainer" containerID="1b61126eca35a8279d32f7b9386c382f26da91f6b28007d881463f438155d2e3" Jan 23 10:52:36 crc kubenswrapper[4957]: E0123 10:52:36.770231 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-z8hcw_openshift-ovn-kubernetes(87adc28a-89e3-4743-a9f2-098d4a9432d8)\"" pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" podUID="87adc28a-89e3-4743-a9f2-098d4a9432d8" Jan 23 10:52:36 crc kubenswrapper[4957]: I0123 10:52:36.858139 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:36 crc kubenswrapper[4957]: I0123 10:52:36.858216 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:36 crc kubenswrapper[4957]: I0123 10:52:36.858236 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:36 crc kubenswrapper[4957]: I0123 10:52:36.858264 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:36 crc kubenswrapper[4957]: I0123 10:52:36.858333 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:36Z","lastTransitionTime":"2026-01-23T10:52:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:36 crc kubenswrapper[4957]: I0123 10:52:36.960871 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:36 crc kubenswrapper[4957]: I0123 10:52:36.960909 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:36 crc kubenswrapper[4957]: I0123 10:52:36.960919 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:36 crc kubenswrapper[4957]: I0123 10:52:36.960934 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:36 crc kubenswrapper[4957]: I0123 10:52:36.960945 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:36Z","lastTransitionTime":"2026-01-23T10:52:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:37 crc kubenswrapper[4957]: I0123 10:52:37.063079 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:37 crc kubenswrapper[4957]: I0123 10:52:37.063116 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:37 crc kubenswrapper[4957]: I0123 10:52:37.063127 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:37 crc kubenswrapper[4957]: I0123 10:52:37.063171 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:37 crc kubenswrapper[4957]: I0123 10:52:37.063185 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:37Z","lastTransitionTime":"2026-01-23T10:52:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:37 crc kubenswrapper[4957]: I0123 10:52:37.165933 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:37 crc kubenswrapper[4957]: I0123 10:52:37.165985 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:37 crc kubenswrapper[4957]: I0123 10:52:37.166000 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:37 crc kubenswrapper[4957]: I0123 10:52:37.166020 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:37 crc kubenswrapper[4957]: I0123 10:52:37.166036 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:37Z","lastTransitionTime":"2026-01-23T10:52:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:37 crc kubenswrapper[4957]: I0123 10:52:37.267997 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:37 crc kubenswrapper[4957]: I0123 10:52:37.268046 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:37 crc kubenswrapper[4957]: I0123 10:52:37.268059 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:37 crc kubenswrapper[4957]: I0123 10:52:37.268077 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:37 crc kubenswrapper[4957]: I0123 10:52:37.268088 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:37Z","lastTransitionTime":"2026-01-23T10:52:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:37 crc kubenswrapper[4957]: I0123 10:52:37.373173 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:37 crc kubenswrapper[4957]: I0123 10:52:37.373254 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:37 crc kubenswrapper[4957]: I0123 10:52:37.373264 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:37 crc kubenswrapper[4957]: I0123 10:52:37.373278 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:37 crc kubenswrapper[4957]: I0123 10:52:37.373316 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:37Z","lastTransitionTime":"2026-01-23T10:52:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:37 crc kubenswrapper[4957]: I0123 10:52:37.475881 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:37 crc kubenswrapper[4957]: I0123 10:52:37.475918 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:37 crc kubenswrapper[4957]: I0123 10:52:37.475929 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:37 crc kubenswrapper[4957]: I0123 10:52:37.475945 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:37 crc kubenswrapper[4957]: I0123 10:52:37.475956 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:37Z","lastTransitionTime":"2026-01-23T10:52:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:37 crc kubenswrapper[4957]: I0123 10:52:37.578298 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:37 crc kubenswrapper[4957]: I0123 10:52:37.578335 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:37 crc kubenswrapper[4957]: I0123 10:52:37.578348 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:37 crc kubenswrapper[4957]: I0123 10:52:37.578366 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:37 crc kubenswrapper[4957]: I0123 10:52:37.578378 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:37Z","lastTransitionTime":"2026-01-23T10:52:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:37 crc kubenswrapper[4957]: I0123 10:52:37.681609 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:37 crc kubenswrapper[4957]: I0123 10:52:37.681886 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:37 crc kubenswrapper[4957]: I0123 10:52:37.681898 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:37 crc kubenswrapper[4957]: I0123 10:52:37.681916 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:37 crc kubenswrapper[4957]: I0123 10:52:37.681928 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:37Z","lastTransitionTime":"2026-01-23T10:52:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:37 crc kubenswrapper[4957]: I0123 10:52:37.754563 4957 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-25 08:11:19.509446047 +0000 UTC Jan 23 10:52:37 crc kubenswrapper[4957]: I0123 10:52:37.784503 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:37 crc kubenswrapper[4957]: I0123 10:52:37.784574 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:37 crc kubenswrapper[4957]: I0123 10:52:37.784597 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:37 crc kubenswrapper[4957]: I0123 10:52:37.784626 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:37 crc kubenswrapper[4957]: I0123 10:52:37.784648 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:37Z","lastTransitionTime":"2026-01-23T10:52:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:37 crc kubenswrapper[4957]: I0123 10:52:37.886689 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:37 crc kubenswrapper[4957]: I0123 10:52:37.886751 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:37 crc kubenswrapper[4957]: I0123 10:52:37.886775 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:37 crc kubenswrapper[4957]: I0123 10:52:37.886805 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:37 crc kubenswrapper[4957]: I0123 10:52:37.886840 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:37Z","lastTransitionTime":"2026-01-23T10:52:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:37 crc kubenswrapper[4957]: I0123 10:52:37.988889 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:37 crc kubenswrapper[4957]: I0123 10:52:37.988948 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:37 crc kubenswrapper[4957]: I0123 10:52:37.988967 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:37 crc kubenswrapper[4957]: I0123 10:52:37.988991 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:37 crc kubenswrapper[4957]: I0123 10:52:37.989008 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:37Z","lastTransitionTime":"2026-01-23T10:52:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:38 crc kubenswrapper[4957]: I0123 10:52:38.091967 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:38 crc kubenswrapper[4957]: I0123 10:52:38.092040 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:38 crc kubenswrapper[4957]: I0123 10:52:38.092056 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:38 crc kubenswrapper[4957]: I0123 10:52:38.092077 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:38 crc kubenswrapper[4957]: I0123 10:52:38.092089 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:38Z","lastTransitionTime":"2026-01-23T10:52:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:38 crc kubenswrapper[4957]: I0123 10:52:38.186117 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-tlz2g_233fdd78-4010-4fe8-9068-ee47d8ff25d1/kube-multus/0.log" Jan 23 10:52:38 crc kubenswrapper[4957]: I0123 10:52:38.186188 4957 generic.go:334] "Generic (PLEG): container finished" podID="233fdd78-4010-4fe8-9068-ee47d8ff25d1" containerID="d6851e0ec1040550b8c9edb1b85213d2c849e381fae6b0f09c9a7247bd9c5088" exitCode=1 Jan 23 10:52:38 crc kubenswrapper[4957]: I0123 10:52:38.186228 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-tlz2g" event={"ID":"233fdd78-4010-4fe8-9068-ee47d8ff25d1","Type":"ContainerDied","Data":"d6851e0ec1040550b8c9edb1b85213d2c849e381fae6b0f09c9a7247bd9c5088"} Jan 23 10:52:38 crc kubenswrapper[4957]: I0123 10:52:38.186797 4957 scope.go:117] "RemoveContainer" containerID="d6851e0ec1040550b8c9edb1b85213d2c849e381fae6b0f09c9a7247bd9c5088" Jan 23 10:52:38 crc kubenswrapper[4957]: I0123 10:52:38.197981 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:38 crc kubenswrapper[4957]: I0123 10:52:38.198029 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:38 crc kubenswrapper[4957]: I0123 10:52:38.198041 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:38 crc kubenswrapper[4957]: I0123 10:52:38.198058 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:38 crc kubenswrapper[4957]: I0123 10:52:38.198070 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:38Z","lastTransitionTime":"2026-01-23T10:52:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:38 crc kubenswrapper[4957]: I0123 10:52:38.201918 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://512cd439903792d034cd6017d149d8f3e9e24ffbfc36964572fc9419d54c3513\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:38Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:38 crc kubenswrapper[4957]: I0123 10:52:38.221129 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:38Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:38 crc kubenswrapper[4957]: I0123 10:52:38.242400 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87adc28a-89e3-4743-a9f2-098d4a9432d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a14cf2687aa7c7a4c43dffbc2ad99a41aef0e46719171f63c7f769ee2d54e4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca18567eec1b0cc34d911b28d9f3d670a061722086817f58236f6a0da557262\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f862c6f11fe904458a8ecde92079c0b4aa4a9cb4dfc6f2ca094a1d3142570d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0efa75cf10a812bc4de5b071048558eb5f48828f6fb3049f3820fe5e0b7e2b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1a90ac89ce8ac710e5f8cff26e69aff44a735ef8155a7e93324809904a33e01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26b4bdc4f2514902dc2c95df59af4c954a5c1905821f5981e9437ff54d6d544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b61126eca35a8279d32f7b9386c382f26da91f6
b28007d881463f438155d2e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b61126eca35a8279d32f7b9386c382f26da91f6b28007d881463f438155d2e3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-23T10:52:21Z\\\",\\\"message\\\":\\\"e configs for network=default: []services.lbConfig(nil)\\\\nI0123 10:52:21.654230 6620 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-service-ca-operator/metrics]} name:Service_openshift-service-ca-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.40:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {2a3fb1a3-a476-4e14-bcf5-fb79af60206a}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0123 10:52:21.654121 6620 ovnkube_controller.go:900] Cache entry expected pod with UID \\\\\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\\\\\" but failed to find it\\\\nF0123 10:52:21.652450 6620 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controlle\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-23T10:52:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-z8hcw_openshift-ovn-kubernetes(87adc28a-89e3-4743-a9f2-098d4a9432d8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1be5b459e3fae28da165ef0ee506ec5ccd39026d7b7e7c35a3f242c65d60d0ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3de3a4f166173fb3f0ff7428b1d1ddf43a9988749fc445ecff4a7527bd1f3a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3de3a4f166173fb3f0ff7428b1d1ddf43a9988749fc445ecff4a7527bd1f3a4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z8hcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:38Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:38 crc kubenswrapper[4957]: I0123 10:52:38.255890 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9rkq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"340bb9e5-0a20-4377-acf7-aba4b7788153\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac26363f220465a8681578f7a7b90cf7d0abf8676379a1e963f4998646327c95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:52:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55l78
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19e8778b10a2bb2a713374cb69e07a76edf2371f7a130a691e759a03c0322251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:52:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55l78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:52:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r9rkq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:38Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:38 crc kubenswrapper[4957]: I0123 10:52:38.266413 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb53662e-fe72-4c19-b3a6-f5b541e5afcb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://829cfdb541d2a7861316957f39b8b9f43ec6f9f4e309a491f4451b1f3c34a9a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d9f270c80ebedc7d79510e2f421e23789483dce954f5e1469469703660febf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90926087d1bb350c991fa9425706fcc22e12eec003aba87b72758892aae9d3a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8144556693b41dc2f9121be49ceed161caf8db5eec797f086128a2016be8072\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:38Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:38 crc kubenswrapper[4957]: I0123 10:52:38.277685 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rg9hb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a6ddd9-627a-4faa-a4c4-096ea19af31d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f469de9a3c43ade33ae855757f1244dcd825827dea9633af7143c078b08d6d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wngq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase
\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rg9hb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:38Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:38 crc kubenswrapper[4957]: I0123 10:52:38.293048 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2xjv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"224e3211-1f68-4673-8975-7e71b1e513d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dd046581d049e9ca0071a010da143a9b28d271b533b9cdc1c94d19311be0320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s26mf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f355e8990ff693448c7b8df392b7b2caeb59d6fee6cf8d5d4200f8ce1b5e03ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s26mf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"host
IP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w2xjv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:38Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:38 crc kubenswrapper[4957]: I0123 10:52:38.300936 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:38 crc kubenswrapper[4957]: I0123 10:52:38.300993 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:38 crc kubenswrapper[4957]: I0123 10:52:38.301012 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:38 crc kubenswrapper[4957]: I0123 10:52:38.301036 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:38 crc kubenswrapper[4957]: I0123 10:52:38.301053 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:38Z","lastTransitionTime":"2026-01-23T10:52:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:38 crc kubenswrapper[4957]: I0123 10:52:38.311192 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tlz2g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"233fdd78-4010-4fe8-9068-ee47d8ff25d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6851e0ec1040550b8c9edb1b85213d2c849e381fae6b0f09c9a7247bd9c5088\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6851e0ec1040550b8c9edb1b85213d2c849e381fae6b0f09c9a7247bd9c5088\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-23T10:52:37Z\\\",\\\"message\\\":\\\"2026-01-23T10:51:52+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_cc2190a4-eddd-4623-837e-d09cf8000fdd\\\\n2026-01-23T10:51:52+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_cc2190a4-eddd-4623-837e-d09cf8000fdd to /host/opt/cni/bin/\\\\n2026-01-23T10:51:52Z [verbose] multus-daemon started\\\\n2026-01-23T10:51:52Z [verbose] Readiness Indicator file check\\\\n2026-01-23T10:52:37Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpwrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tlz2g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:38Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:38 crc kubenswrapper[4957]: I0123 10:52:38.330971 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8631604-63ce-40b0-b27e-fba17f940f20\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://523b9a208f414955faffe254957d3bb6d287eab26ea653e23c9bcc2c3182d5cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80645d17b02b24a907b20d376fcb65a794768d4c9cf07550bff63d50a011836d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3fcfc9fcf5f37f32b4a654710f3f0f5c3fab5b0b5c35239e5f1a2789d1ec480\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b48e3593322a778bf8d56e3509d97a341f1fee5
e172f8ba4bbc4c1dacefb3930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://193091ca5d5fb974b1e2da289e7fbc6e2d3a292e79d1936c7ba10266a5ba9779\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7feaf0932d8842db7af2f28984a811c16674b50ba28a4627829ce8471543daa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7feaf0932d8842db7af2f28984a811c16674b50ba28a4627829ce8471543daa9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://370f9a1676bfe308ad9883454af09a802df678db77e0e246de3cc7aea91410a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://370f9a1676bfe308ad9883454af09a802df678db77e0e246de3cc7aea91410a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://78b38484db8444b196494249f32bd63f2478d80e89ff823c8ac7c974b1ea6e76\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78b38484db8444b196494249f32bd63f2478d80e89ff823c8ac7c974b1ea6e76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:38Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:38 crc kubenswrapper[4957]: I0123 10:52:38.346507 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b07c6571fe0e39bd6607feb900919a481ef8a36483c9b4de1c6d5ea3453ba61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:38Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:38 crc kubenswrapper[4957]: I0123 10:52:38.357366 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fnxz6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c7b1449-2e9b-4c07-a531-591cb968f511\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6410b47b5b38b4ce50175e8cd9c2cc7ca241b914d1dba4accf3a1deb3e066ccd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj687\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:49Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fnxz6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:38Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:38 crc kubenswrapper[4957]: I0123 10:52:38.372212 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9f19c40d295c11e3a1170d61fd738b1dcd8fb10087f6a1bb74e6e6c8e6cfb56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6fcda9eaf99f7b60db85da6f18a98ccca7b5bc532aa28388fc7845caf1a7356\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:38Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:38 crc kubenswrapper[4957]: I0123 10:52:38.387313 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:38Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:38 crc kubenswrapper[4957]: I0123 10:52:38.402989 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:38 crc kubenswrapper[4957]: I0123 10:52:38.403387 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:38 crc kubenswrapper[4957]: I0123 10:52:38.403668 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:38 crc kubenswrapper[4957]: I0123 10:52:38.403869 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:38 crc kubenswrapper[4957]: I0123 10:52:38.404045 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:38Z","lastTransitionTime":"2026-01-23T10:52:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:38 crc kubenswrapper[4957]: I0123 10:52:38.408503 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6cq2v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11d94cd0-1619-4ef6-952a-aef84e1cdc75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdb19fbdef461009ebd78d9089ba9c94908e4c9fbcab108320e0d89c7f30547f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d28aff8ae7d4adbef5753dc21af0f1055f19ed4404bf9af31f34be215120555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d28aff8ae7d4adbef5753dc21af0f1055f19ed4404bf9af31f34be215120555\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90e2094361e033ef7c8226913e9e396e6a2036617edb9c5ad0ed00c2c5f9f707\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90e2094361e033ef7c8226913e9e396e6a2036617edb9c5ad0ed00c2c5f9f707\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e945a247fdc5833508edb04c5130a406d137e3b0ab1be12f9bf954f2621c3bf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e945a247fdc5833508edb04c5130a406d137e3b0ab1be12f9bf954f2621c3bf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26ffeff282bb20d68e9f648999f28b818a67b74f062714fd26ad22e70591a74c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26ffeff282bb20d68e9f648999f28b818a67b74f062714fd26ad22e70591a74c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bbe74c2c27b97766cfa5d26987cd0fbd8289871e66eab78c54a52c0aa039728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bbe74c2c27b97766cfa5d26987cd0fbd8289871e66eab78c54a52c0aa039728\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9ee5b55e77324735662dd6bc0fdeee86af454eb4b9e8eb9e877119f7c1395ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9ee5b55e77324735662dd6bc0fdeee86af454eb4b9e8eb9e877119f7c1395ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6cq2v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:38Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:38 crc kubenswrapper[4957]: I0123 10:52:38.423436 4957 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/network-metrics-daemon-5fxfb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87775b38-0664-48f6-8857-7568c135bd79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrj7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrj7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:52:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5fxfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:38Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:38 crc kubenswrapper[4957]: I0123 10:52:38.443942 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea507738-b425-4366-808b-3a47317e66d0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cdf71b1a8491d3a4853fde19a5b1af1eb4697cbf07de482e22a52704ba0470f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52eea4c3c7c3b8898e64dd0eb05c1883ea1c2fa94e7e606f3ab48bbf5aaee8d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f405b6b517d30a201b793965bd82536f496d62b89562cefc7e3a9d9f7829633\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b6915f908509c8609290327ffc2dccf0e5680dc227979285a7ebaca4643cb7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e837e02e63dbe59e7920302c0fb0b5c9165e96ebb684adadb02bacd61633214\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"le observer\\\\nW0123 10:51:48.273886 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0123 10:51:48.273997 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0123 10:51:48.275269 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3502356273/tls.crt::/tmp/serving-cert-3502356273/tls.key\\\\\\\"\\\\nI0123 10:51:48.537137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0123 10:51:48.548481 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0123 10:51:48.548515 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0123 10:51:48.548544 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0123 10:51:48.548577 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0123 10:51:48.561057 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0123 10:51:48.561112 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0123 10:51:48.561123 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0123 10:51:48.561139 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0123 10:51:48.561151 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0123 10:51:48.561158 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0123 10:51:48.561167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0123 10:51:48.561209 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0123 10:51:48.561756 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da77583099215643577c5d064d67ce2cca9d0b74e7ba7c88f3a948a8516fd66c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd61e9841e88c0a34764491cfafe3e4f94233e4ac0543a3e6ed1326dd8d7280d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd61e9841e88c0a34764491cfafe3e4f94233e4ac0543a3e6ed1326dd8d7280d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:38Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:38 crc kubenswrapper[4957]: I0123 10:52:38.461720 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"49d5bf3b-7b38-431d-abdd-266da7d33d54\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dea2d00e0b56b1da716e6188c5d0c1cbb52bcdf2a9483168aabc8f3bc408b7b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e85d135cfb611a98270feed36b0cb6f5992ca1432d5d1af0a62465e71ddd0244\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95ec9dd4bf806f7dd14d9e8b14fb6ccd83a8f5b1226a4cb365a946f4c6f8adad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://646a6c52a5b43295fea795b62f3903a351b07e95dc45af842bbe3c3218e143ff\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://646a6c52a5b43295fea795b62f3903a351b07e95dc45af842bbe3c3218e143ff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:38Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:38 crc kubenswrapper[4957]: I0123 10:52:38.478792 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:38Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:38 crc kubenswrapper[4957]: I0123 10:52:38.506927 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:38 crc kubenswrapper[4957]: I0123 10:52:38.506976 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:38 crc kubenswrapper[4957]: I0123 10:52:38.506989 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:38 crc kubenswrapper[4957]: I0123 10:52:38.507008 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:38 crc kubenswrapper[4957]: I0123 10:52:38.507019 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:38Z","lastTransitionTime":"2026-01-23T10:52:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:38 crc kubenswrapper[4957]: I0123 10:52:38.609498 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:38 crc kubenswrapper[4957]: I0123 10:52:38.609541 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:38 crc kubenswrapper[4957]: I0123 10:52:38.609550 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:38 crc kubenswrapper[4957]: I0123 10:52:38.609563 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:38 crc kubenswrapper[4957]: I0123 10:52:38.609573 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:38Z","lastTransitionTime":"2026-01-23T10:52:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:38 crc kubenswrapper[4957]: I0123 10:52:38.711563 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:38 crc kubenswrapper[4957]: I0123 10:52:38.711625 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:38 crc kubenswrapper[4957]: I0123 10:52:38.711646 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:38 crc kubenswrapper[4957]: I0123 10:52:38.711669 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:38 crc kubenswrapper[4957]: I0123 10:52:38.711688 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:38Z","lastTransitionTime":"2026-01-23T10:52:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:38 crc kubenswrapper[4957]: I0123 10:52:38.755318 4957 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-07 23:36:35.070902246 +0000 UTC Jan 23 10:52:38 crc kubenswrapper[4957]: I0123 10:52:38.769652 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 23 10:52:38 crc kubenswrapper[4957]: I0123 10:52:38.769777 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 23 10:52:38 crc kubenswrapper[4957]: I0123 10:52:38.770022 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 23 10:52:38 crc kubenswrapper[4957]: E0123 10:52:38.770007 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 23 10:52:38 crc kubenswrapper[4957]: I0123 10:52:38.770032 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5fxfb" Jan 23 10:52:38 crc kubenswrapper[4957]: E0123 10:52:38.770560 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 23 10:52:38 crc kubenswrapper[4957]: E0123 10:52:38.770671 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 23 10:52:38 crc kubenswrapper[4957]: E0123 10:52:38.770727 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5fxfb" podUID="87775b38-0664-48f6-8857-7568c135bd79" Jan 23 10:52:38 crc kubenswrapper[4957]: I0123 10:52:38.813887 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:38 crc kubenswrapper[4957]: I0123 10:52:38.813924 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:38 crc kubenswrapper[4957]: I0123 10:52:38.813935 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:38 crc kubenswrapper[4957]: I0123 10:52:38.813950 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:38 crc kubenswrapper[4957]: I0123 10:52:38.813960 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:38Z","lastTransitionTime":"2026-01-23T10:52:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:38 crc kubenswrapper[4957]: I0123 10:52:38.916591 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:38 crc kubenswrapper[4957]: I0123 10:52:38.916657 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:38 crc kubenswrapper[4957]: I0123 10:52:38.916674 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:38 crc kubenswrapper[4957]: I0123 10:52:38.916696 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:38 crc kubenswrapper[4957]: I0123 10:52:38.916713 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:38Z","lastTransitionTime":"2026-01-23T10:52:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:39 crc kubenswrapper[4957]: I0123 10:52:39.019296 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:39 crc kubenswrapper[4957]: I0123 10:52:39.019328 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:39 crc kubenswrapper[4957]: I0123 10:52:39.019337 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:39 crc kubenswrapper[4957]: I0123 10:52:39.019365 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:39 crc kubenswrapper[4957]: I0123 10:52:39.019374 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:39Z","lastTransitionTime":"2026-01-23T10:52:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:39 crc kubenswrapper[4957]: I0123 10:52:39.121951 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:39 crc kubenswrapper[4957]: I0123 10:52:39.122002 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:39 crc kubenswrapper[4957]: I0123 10:52:39.122011 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:39 crc kubenswrapper[4957]: I0123 10:52:39.122024 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:39 crc kubenswrapper[4957]: I0123 10:52:39.122033 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:39Z","lastTransitionTime":"2026-01-23T10:52:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:39 crc kubenswrapper[4957]: I0123 10:52:39.191863 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-tlz2g_233fdd78-4010-4fe8-9068-ee47d8ff25d1/kube-multus/0.log" Jan 23 10:52:39 crc kubenswrapper[4957]: I0123 10:52:39.191966 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-tlz2g" event={"ID":"233fdd78-4010-4fe8-9068-ee47d8ff25d1","Type":"ContainerStarted","Data":"a41f7e81b1359b374160b43aed747c1058d4a086980d803825ae41e507f3d77c"} Jan 23 10:52:39 crc kubenswrapper[4957]: I0123 10:52:39.212151 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49d5bf3b-7b38-431d-abdd-266da7d33d54\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dea2d00e0b56b1da716e6188c5d0c1cbb52bcdf2a9483168aabc8f3bc408b7b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e85d135cfb611a98270feed36b0cb6f5992ca1432d5d1af0a62465e71ddd0244\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95ec9dd4bf806f7dd14d9e8b14fb6ccd83a8f5b1226a4cb365a946f4c6f8adad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://646a6c52a5b43295fea795b62f3903a351b07e95dc45af842bbe3c3218e143ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://646a6c52a5b43295fea795b62f3903a351b07e95dc45af842bbe3c3218e143ff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:39Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:39 crc kubenswrapper[4957]: I0123 10:52:39.225047 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:39 crc kubenswrapper[4957]: I0123 10:52:39.225088 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:39 crc kubenswrapper[4957]: I0123 10:52:39.225099 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:39 crc kubenswrapper[4957]: I0123 10:52:39.225116 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:39 crc kubenswrapper[4957]: I0123 10:52:39.225130 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:39Z","lastTransitionTime":"2026-01-23T10:52:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:39 crc kubenswrapper[4957]: I0123 10:52:39.227393 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:39Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:39 crc kubenswrapper[4957]: I0123 10:52:39.245847 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9f19c40d295c11e3a1170d61fd738b1dcd8fb10087f6a1bb74e6e6c8e6cfb56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6fcda9eaf99f7b60db85da6f18a98ccca7b5bc532aa28388fc7845caf1a7356\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:39Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:39 crc kubenswrapper[4957]: I0123 10:52:39.259800 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:39Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:39 crc kubenswrapper[4957]: I0123 10:52:39.277153 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6cq2v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11d94cd0-1619-4ef6-952a-aef84e1cdc75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdb19fbdef461009ebd78d9089ba9c94908e4c9fbcab108320e0d89c7f30547f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d28aff8ae7d4adbef5753dc21af0f1055f19ed4404bf9af31f34be215120555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d28aff8ae7d4adbef5753dc21af0f1055f19ed4404bf9af31f34be215120555\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90e2094361e033ef7c8226913e9e396e6a2036617edb9c5ad0ed00c2c5f9f707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90e2094361e033ef7c8226913e9e396e6a2036617edb9c5ad0ed00c2c5f9f707\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e945a247fdc5833508edb04c5130a406d137e3b0ab1be12f9bf954f2621c3bf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e945a247fdc5833508edb04c5130a406d137e3b0ab1be12f9bf954f2621c3bf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26ffeff282bb20d68e9f648999f28b818a67b74f062714fd26ad22e70591a74c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26ffeff282bb20d68e9f648999f28b818a67b74f062714fd26ad22e70591a74c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bbe74c2c27b97766cfa5d26987cd0fbd8289871e66eab78c54a52c0aa039728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bbe74c2c27b97766cfa5d26987cd0fbd8289871e66eab78c54a52c0aa039728\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9ee5b55e77324735662dd6bc0fdeee86af454eb4b9e8eb9e877119f7c1395ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9ee5b55e77324735662dd6bc0fdeee86af454eb4b9e8eb9e877119f7c1395ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6cq2v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:39Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:39 crc kubenswrapper[4957]: I0123 10:52:39.288358 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5fxfb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"87775b38-0664-48f6-8857-7568c135bd79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrj7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrj7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:52:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5fxfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:39Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:39 crc kubenswrapper[4957]: I0123 10:52:39.302923 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea507738-b425-4366-808b-3a47317e66d0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cdf71b1a8491d3a4853fde19a5b1af1eb4697cbf07de482e22a52704ba0470f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52eea4c3c7c3b8898e64dd0eb05c1883ea1c2fa94e7e606f3ab48bbf5aaee8d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f405b6b517d30a201b793965bd82536f496d62b89562cefc7e3a9d9f7829633\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b6915f908509c8609290327ffc2dccf0e5680dc227979285a7ebaca4643cb7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e837e02e63dbe59e7920302c0fb0b5c9165e96ebb684adadb02bacd61633214\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"le observer\\\\nW0123 10:51:48.273886 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0123 10:51:48.273997 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0123 10:51:48.275269 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3502356273/tls.crt::/tmp/serving-cert-3502356273/tls.key\\\\\\\"\\\\nI0123 10:51:48.537137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0123 10:51:48.548481 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0123 10:51:48.548515 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0123 10:51:48.548544 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0123 10:51:48.548577 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0123 10:51:48.561057 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0123 10:51:48.561112 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0123 10:51:48.561123 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0123 10:51:48.561139 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0123 10:51:48.561151 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0123 10:51:48.561158 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0123 10:51:48.561167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0123 10:51:48.561209 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0123 10:51:48.561756 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da77583099215643577c5d064d67ce2cca9d0b74e7ba7c88f3a948a8516fd66c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd61e9841e88c0a34764491cfafe3e4f94233e4ac0543a3e6ed1326dd8d7280d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd61e9841e88c0a34764491cfafe3e4f94233e4ac0543a3e6ed1326dd8d7280d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:39Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:39 crc kubenswrapper[4957]: I0123 10:52:39.318775 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://512cd439903792d034cd6017d149d8f3e9e24ffbfc36964572fc9419d54c3513\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:39Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:39 crc kubenswrapper[4957]: I0123 10:52:39.328133 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:39 crc kubenswrapper[4957]: I0123 10:52:39.328159 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:39 crc kubenswrapper[4957]: I0123 10:52:39.328168 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:39 crc kubenswrapper[4957]: I0123 10:52:39.328181 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:39 crc kubenswrapper[4957]: I0123 10:52:39.328189 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:39Z","lastTransitionTime":"2026-01-23T10:52:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:39 crc kubenswrapper[4957]: I0123 10:52:39.333131 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:39Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:39 crc kubenswrapper[4957]: I0123 10:52:39.343325 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2xjv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"224e3211-1f68-4673-8975-7e71b1e513d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dd046581d049e9ca0071a010da143a9b28d271b533b9cdc1c94d19311be0320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s26mf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f355e8990ff693448c7b8df392b7b2caeb59d6fee6cf8d5d4200f8ce1b5e03ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s26mf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w2xjv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:39Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:39 crc kubenswrapper[4957]: I0123 10:52:39.353494 4957 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-tlz2g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"233fdd78-4010-4fe8-9068-ee47d8ff25d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a41f7e81b1359b374160b43aed747c1058d4a086980d803825ae41e507f3d77c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6851e0ec1040550b8c9edb1b85213d2c849e381fae6b0f09c9a7247bd9c5088\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-23T10:52:37Z\\\",\\\"message\\\":\\\"2026-01-23T10:51:52+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_cc2190a4-eddd-4623-837e-d09cf8000fdd\\\\n2026-01-23T10:51:52+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_cc2190a4-eddd-4623-837e-d09cf8000fdd to /host/opt/cni/bin/\\\\n2026-01-23T10:51:52Z [verbose] multus-daemon started\\\\n2026-01-23T10:51:52Z [verbose] Readiness Indicator file check\\\\n2026-01-23T10:52:37Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpwrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tlz2g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:39Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:39 crc kubenswrapper[4957]: I0123 10:52:39.368569 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87adc28a-89e3-4743-a9f2-098d4a9432d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a14cf2687aa7c7a4c43dffbc2ad99a41aef0e46719171f63c7f769ee2d54e4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca18567eec1b0cc34d911b28d9f3d670a061722086817f58236f6a0da557262\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f862c6f11fe904458a8ecde92079c0b4aa4a9cb4dfc6f2ca094a1d3142570d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",
\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0efa75cf10a812bc4de5b071048558eb5f48828f6fb3049f3820fe5e0b7e2b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1a90ac89ce8ac710e5f8cff26e69aff44a735ef8155a7e93324809904a33e01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26b4bdc4f2514902dc2c95df59af4c954a5c1905821f5981e9437ff54d6d544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-s
ocket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b61126eca35a8279d32f7b9386c382f26da91f6b28007d881463f438155d2e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b61126eca35a8279d32f7b9386c382f26da91f6b28007d881463f438155d2e3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-23T10:52:21Z\\\",\\\"message\\\":\\\"e configs for network=default: []services.lbConfig(nil)\\\\nI0123 10:52:21.654230 6620 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-service-ca-operator/metrics]} name:Service_openshift-service-ca-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.40:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {2a3fb1a3-a476-4e14-bcf5-fb79af60206a}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0123 10:52:21.654121 6620 ovnkube_controller.go:900] Cache entry expected pod with UID \\\\\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\\\\\" but failed to find it\\\\nF0123 10:52:21.652450 6620 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controlle\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-23T10:52:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-z8hcw_openshift-ovn-kubernetes(87adc28a-89e3-4743-a9f2-098d4a9432d8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1be5b459e3fae28da165ef0ee506ec5ccd39026d7b7e7c35a3f242c65d60d0ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3de3a4f166173fb3f0ff7428b1d1ddf43a9988749fc445ecff4a7527bd1f3a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3de3a4f166173fb3f0ff7428b1d1ddf43a9988749fc445ecff4a7527bd1f3a4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z8hcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:39Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:39 crc kubenswrapper[4957]: I0123 10:52:39.381198 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9rkq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"340bb9e5-0a20-4377-acf7-aba4b7788153\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac26363f220465a8681578f7a7b90cf7d0abf8676379a1e963f4998646327c95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:52:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55l78
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19e8778b10a2bb2a713374cb69e07a76edf2371f7a130a691e759a03c0322251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:52:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55l78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:52:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r9rkq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:39Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:39 crc kubenswrapper[4957]: I0123 10:52:39.392969 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb53662e-fe72-4c19-b3a6-f5b541e5afcb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://829cfdb541d2a7861316957f39b8b9f43ec6f9f4e309a491f4451b1f3c34a9a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d9f270c80ebedc7d79510e2f421e23789483dce954f5e1469469703660febf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90926087d1bb350c991fa9425706fcc22e12eec003aba87b72758892aae9d3a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8144556693b41dc2f9121be49ceed161caf8db5eec797f086128a2016be8072\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:39Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:39 crc kubenswrapper[4957]: I0123 10:52:39.401324 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rg9hb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a6ddd9-627a-4faa-a4c4-096ea19af31d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f469de9a3c43ade33ae855757f1244dcd825827dea9633af7143c078b08d6d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wngq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase
\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rg9hb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:39Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:39 crc kubenswrapper[4957]: I0123 10:52:39.411397 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fnxz6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c7b1449-2e9b-4c07-a531-591cb968f511\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6410b47b5b38b4ce50175e8cd9c2cc7ca241b914d1dba4accf3a1deb3e066ccd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj687\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:49Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fnxz6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:39Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:39 crc kubenswrapper[4957]: I0123 10:52:39.431347 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8631604-63ce-40b0-b27e-fba17f940f20\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://523b9a208f414955faffe254957d3bb6d287eab26ea653e23c9bcc2c3182d5cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80645d17b02b24a907b20d376fcb65a794768d4c9cf07550bff63d50a011836d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3fcfc9fcf5f37f32b4a654710f3f0f5c3fab5b0b5c35239e5f1a2789d1ec480\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b48e3593322a778bf8d56e3509d97a341f1fee5
e172f8ba4bbc4c1dacefb3930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://193091ca5d5fb974b1e2da289e7fbc6e2d3a292e79d1936c7ba10266a5ba9779\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7feaf0932d8842db7af2f28984a811c16674b50ba28a4627829ce8471543daa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7feaf0932d8842db7af2f28984a811c16674b50ba28a4627829ce8471543daa9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://370f9a1676bfe308ad9883454af09a802df678db77e0e246de3cc7aea91410a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://370f9a1676bfe308ad9883454af09a802df678db77e0e246de3cc7aea91410a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://78b38484db8444b196494249f32bd63f2478d80e89ff823c8ac7c974b1ea6e76\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78b38484db8444b196494249f32bd63f2478d80e89ff823c8ac7c974b1ea6e76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:39Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:39 crc kubenswrapper[4957]: I0123 10:52:39.436413 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:39 crc kubenswrapper[4957]: I0123 10:52:39.436449 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:39 crc kubenswrapper[4957]: I0123 10:52:39.436461 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:39 crc kubenswrapper[4957]: I0123 10:52:39.436505 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:39 crc kubenswrapper[4957]: I0123 10:52:39.436517 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:39Z","lastTransitionTime":"2026-01-23T10:52:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:39 crc kubenswrapper[4957]: I0123 10:52:39.445331 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b07c6571fe0e39bd6607feb900919a481ef8a36483c9b4de1c6d5ea3453ba61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:39Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:39 crc kubenswrapper[4957]: I0123 10:52:39.538731 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:39 crc kubenswrapper[4957]: I0123 10:52:39.538769 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:39 crc kubenswrapper[4957]: I0123 10:52:39.538778 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:39 crc kubenswrapper[4957]: I0123 10:52:39.538793 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:39 crc kubenswrapper[4957]: I0123 10:52:39.538802 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:39Z","lastTransitionTime":"2026-01-23T10:52:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:39 crc kubenswrapper[4957]: I0123 10:52:39.640911 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:39 crc kubenswrapper[4957]: I0123 10:52:39.640966 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:39 crc kubenswrapper[4957]: I0123 10:52:39.640984 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:39 crc kubenswrapper[4957]: I0123 10:52:39.641007 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:39 crc kubenswrapper[4957]: I0123 10:52:39.641024 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:39Z","lastTransitionTime":"2026-01-23T10:52:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:39 crc kubenswrapper[4957]: I0123 10:52:39.743600 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:39 crc kubenswrapper[4957]: I0123 10:52:39.743629 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:39 crc kubenswrapper[4957]: I0123 10:52:39.743637 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:39 crc kubenswrapper[4957]: I0123 10:52:39.743652 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:39 crc kubenswrapper[4957]: I0123 10:52:39.743660 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:39Z","lastTransitionTime":"2026-01-23T10:52:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:39 crc kubenswrapper[4957]: I0123 10:52:39.756419 4957 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-24 05:46:18.458360847 +0000 UTC Jan 23 10:52:39 crc kubenswrapper[4957]: I0123 10:52:39.846544 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:39 crc kubenswrapper[4957]: I0123 10:52:39.846598 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:39 crc kubenswrapper[4957]: I0123 10:52:39.846621 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:39 crc kubenswrapper[4957]: I0123 10:52:39.846650 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:39 crc kubenswrapper[4957]: I0123 10:52:39.846670 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:39Z","lastTransitionTime":"2026-01-23T10:52:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:39 crc kubenswrapper[4957]: I0123 10:52:39.949613 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:39 crc kubenswrapper[4957]: I0123 10:52:39.949656 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:39 crc kubenswrapper[4957]: I0123 10:52:39.949668 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:39 crc kubenswrapper[4957]: I0123 10:52:39.949685 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:39 crc kubenswrapper[4957]: I0123 10:52:39.949698 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:39Z","lastTransitionTime":"2026-01-23T10:52:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:40 crc kubenswrapper[4957]: I0123 10:52:40.051858 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:40 crc kubenswrapper[4957]: I0123 10:52:40.051894 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:40 crc kubenswrapper[4957]: I0123 10:52:40.051905 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:40 crc kubenswrapper[4957]: I0123 10:52:40.051919 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:40 crc kubenswrapper[4957]: I0123 10:52:40.051929 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:40Z","lastTransitionTime":"2026-01-23T10:52:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:40 crc kubenswrapper[4957]: I0123 10:52:40.153618 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:40 crc kubenswrapper[4957]: I0123 10:52:40.153666 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:40 crc kubenswrapper[4957]: I0123 10:52:40.153678 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:40 crc kubenswrapper[4957]: I0123 10:52:40.153694 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:40 crc kubenswrapper[4957]: I0123 10:52:40.153705 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:40Z","lastTransitionTime":"2026-01-23T10:52:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:40 crc kubenswrapper[4957]: I0123 10:52:40.258688 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:40 crc kubenswrapper[4957]: I0123 10:52:40.258722 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:40 crc kubenswrapper[4957]: I0123 10:52:40.258732 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:40 crc kubenswrapper[4957]: I0123 10:52:40.258747 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:40 crc kubenswrapper[4957]: I0123 10:52:40.258757 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:40Z","lastTransitionTime":"2026-01-23T10:52:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:40 crc kubenswrapper[4957]: I0123 10:52:40.361811 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:40 crc kubenswrapper[4957]: I0123 10:52:40.361876 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:40 crc kubenswrapper[4957]: I0123 10:52:40.361893 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:40 crc kubenswrapper[4957]: I0123 10:52:40.361916 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:40 crc kubenswrapper[4957]: I0123 10:52:40.361933 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:40Z","lastTransitionTime":"2026-01-23T10:52:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:40 crc kubenswrapper[4957]: I0123 10:52:40.463886 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:40 crc kubenswrapper[4957]: I0123 10:52:40.463918 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:40 crc kubenswrapper[4957]: I0123 10:52:40.463925 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:40 crc kubenswrapper[4957]: I0123 10:52:40.463938 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:40 crc kubenswrapper[4957]: I0123 10:52:40.463947 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:40Z","lastTransitionTime":"2026-01-23T10:52:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:40 crc kubenswrapper[4957]: I0123 10:52:40.566107 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:40 crc kubenswrapper[4957]: I0123 10:52:40.566187 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:40 crc kubenswrapper[4957]: I0123 10:52:40.566200 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:40 crc kubenswrapper[4957]: I0123 10:52:40.566216 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:40 crc kubenswrapper[4957]: I0123 10:52:40.566226 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:40Z","lastTransitionTime":"2026-01-23T10:52:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:40 crc kubenswrapper[4957]: I0123 10:52:40.668439 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:40 crc kubenswrapper[4957]: I0123 10:52:40.668466 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:40 crc kubenswrapper[4957]: I0123 10:52:40.668473 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:40 crc kubenswrapper[4957]: I0123 10:52:40.668486 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:40 crc kubenswrapper[4957]: I0123 10:52:40.668495 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:40Z","lastTransitionTime":"2026-01-23T10:52:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:40 crc kubenswrapper[4957]: I0123 10:52:40.757517 4957 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-12 13:25:54.137090793 +0000 UTC Jan 23 10:52:40 crc kubenswrapper[4957]: I0123 10:52:40.769510 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 23 10:52:40 crc kubenswrapper[4957]: I0123 10:52:40.769632 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 23 10:52:40 crc kubenswrapper[4957]: I0123 10:52:40.769746 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 23 10:52:40 crc kubenswrapper[4957]: E0123 10:52:40.769886 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 23 10:52:40 crc kubenswrapper[4957]: E0123 10:52:40.770126 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 23 10:52:40 crc kubenswrapper[4957]: E0123 10:52:40.770191 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 23 10:52:40 crc kubenswrapper[4957]: I0123 10:52:40.770580 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5fxfb" Jan 23 10:52:40 crc kubenswrapper[4957]: I0123 10:52:40.770698 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:40 crc kubenswrapper[4957]: E0123 10:52:40.770708 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5fxfb" podUID="87775b38-0664-48f6-8857-7568c135bd79" Jan 23 10:52:40 crc kubenswrapper[4957]: I0123 10:52:40.770737 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:40 crc kubenswrapper[4957]: I0123 10:52:40.770755 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:40 crc kubenswrapper[4957]: I0123 10:52:40.770775 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:40 crc kubenswrapper[4957]: I0123 10:52:40.770792 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:40Z","lastTransitionTime":"2026-01-23T10:52:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:40 crc kubenswrapper[4957]: I0123 10:52:40.782333 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://512cd439903792d034cd6017d149d8f3e9e24ffbfc36964572fc9419d54c3513\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:40Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:40 crc kubenswrapper[4957]: I0123 10:52:40.794563 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:40Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:40 crc kubenswrapper[4957]: I0123 10:52:40.811868 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87adc28a-89e3-4743-a9f2-098d4a9432d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a14cf2687aa7c7a4c43dffbc2ad99a41aef0e46719171f63c7f769ee2d54e4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca18567eec1b0cc34d911b28d9f3d670a061722086817f58236f6a0da557262\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f862c6f11fe904458a8ecde92079c0b4aa4a9cb4dfc6f2ca094a1d3142570d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0efa75cf10a812bc4de5b071048558eb5f48828f6fb3049f3820fe5e0b7e2b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1a90ac89ce8ac710e5f8cff26e69aff44a735ef8155a7e93324809904a33e01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26b4bdc4f2514902dc2c95df59af4c954a5c1905821f5981e9437ff54d6d544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b61126eca35a8279d32f7b9386c382f26da91f6
b28007d881463f438155d2e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b61126eca35a8279d32f7b9386c382f26da91f6b28007d881463f438155d2e3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-23T10:52:21Z\\\",\\\"message\\\":\\\"e configs for network=default: []services.lbConfig(nil)\\\\nI0123 10:52:21.654230 6620 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-service-ca-operator/metrics]} name:Service_openshift-service-ca-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.40:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {2a3fb1a3-a476-4e14-bcf5-fb79af60206a}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0123 10:52:21.654121 6620 ovnkube_controller.go:900] Cache entry expected pod with UID \\\\\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\\\\\" but failed to find it\\\\nF0123 10:52:21.652450 6620 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controlle\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-23T10:52:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-z8hcw_openshift-ovn-kubernetes(87adc28a-89e3-4743-a9f2-098d4a9432d8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1be5b459e3fae28da165ef0ee506ec5ccd39026d7b7e7c35a3f242c65d60d0ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3de3a4f166173fb3f0ff7428b1d1ddf43a9988749fc445ecff4a7527bd1f3a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3de3a4f166173fb3f0ff7428b1d1ddf43a9988749fc445ecff4a7527bd1f3a4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z8hcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:40Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:40 crc kubenswrapper[4957]: I0123 10:52:40.824138 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9rkq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"340bb9e5-0a20-4377-acf7-aba4b7788153\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac26363f220465a8681578f7a7b90cf7d0abf8676379a1e963f4998646327c95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:52:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55l78
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19e8778b10a2bb2a713374cb69e07a76edf2371f7a130a691e759a03c0322251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:52:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55l78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:52:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r9rkq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:40Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:40 crc kubenswrapper[4957]: I0123 10:52:40.836596 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb53662e-fe72-4c19-b3a6-f5b541e5afcb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://829cfdb541d2a7861316957f39b8b9f43ec6f9f4e309a491f4451b1f3c34a9a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d9f270c80ebedc7d79510e2f421e23789483dce954f5e1469469703660febf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90926087d1bb350c991fa9425706fcc22e12eec003aba87b72758892aae9d3a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8144556693b41dc2f9121be49ceed161caf8db5eec797f086128a2016be8072\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:40Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:40 crc kubenswrapper[4957]: I0123 10:52:40.847052 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rg9hb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a6ddd9-627a-4faa-a4c4-096ea19af31d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f469de9a3c43ade33ae855757f1244dcd825827dea9633af7143c078b08d6d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wngq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase
\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rg9hb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:40Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:40 crc kubenswrapper[4957]: I0123 10:52:40.856879 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2xjv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"224e3211-1f68-4673-8975-7e71b1e513d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dd046581d049e9ca0071a010da143a9b28d271b533b9cdc1c94d19311be0320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s26mf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f355e8990ff693448c7b8df392b7b2caeb59d6fee6cf8d5d4200f8ce1b5e03ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s26mf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"host
IP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w2xjv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:40Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:40 crc kubenswrapper[4957]: I0123 10:52:40.868996 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tlz2g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"233fdd78-4010-4fe8-9068-ee47d8ff25d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a41f7e81b1359b374160b43aed747c1058d4a086980d803825ae41e507f3d77c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6851e0ec1040550b8c9edb1b85213d2c849e381fae6b0f09c9a7247bd9c5088\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-23T10:52:37Z\\\",\\\"message\\\":\\\"2026-01-23T10:51:52+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_cc2190a4-eddd-4623-837e-d09cf8000fdd\\\\n2026-01-23T10:51:52+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_cc2190a4-eddd-4623-837e-d09cf8000fdd to /host/opt/cni/bin/\\\\n2026-01-23T10:51:52Z [verbose] multus-daemon started\\\\n2026-01-23T10:51:52Z [verbose] Readiness Indicator file check\\\\n2026-01-23T10:52:37Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpwrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tlz2g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:40Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:40 crc kubenswrapper[4957]: I0123 10:52:40.871980 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:40 crc kubenswrapper[4957]: I0123 10:52:40.872015 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:40 crc kubenswrapper[4957]: I0123 10:52:40.872027 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:40 crc kubenswrapper[4957]: I0123 10:52:40.872042 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:40 crc kubenswrapper[4957]: I0123 10:52:40.872052 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:40Z","lastTransitionTime":"2026-01-23T10:52:40Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:40 crc kubenswrapper[4957]: I0123 10:52:40.888254 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8631604-63ce-40b0-b27e-fba17f940f20\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://523b9a208f414955faffe254957d3bb6d287eab26ea653e23c9bcc2c3182d5cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80645d17b02b24a907b20d376fcb65a794768d4c9cf07550bff63d50a011836d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3fcfc9fcf5f37f32b4a654710f3f0f5c3fab5b0b5c35239e5f1a2789d1ec480\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\"
:true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b48e3593322a778bf8d56e3509d97a341f1fee5e172f8ba4bbc4c1dacefb3930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://193091ca5d5fb974b1e2da289e7fbc6e2d3a292e79d1936c7ba10266a5ba9779\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7feaf0932d8842db7af2f28984a811c16674b50ba28a4627829ce8471543daa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7feaf0932d8842db7af2f28984a811c16674b50ba28a4627829ce8471543daa9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://370f9a1676bfe308ad9883454af09a802df678db77e0e246de3cc7aea91410a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fal
se,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://370f9a1676bfe308ad9883454af09a802df678db77e0e246de3cc7aea91410a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://78b38484db8444b196494249f32bd63f2478d80e89ff823c8ac7c974b1ea6e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78b38484db8444b196494249f32bd63f2478d80e89ff823c8ac7c974b1ea6e76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:40Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:40 crc kubenswrapper[4957]: I0123 10:52:40.898594 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b07c6571fe0e39bd6607feb900919a481ef8a36483c9b4de1c6d5ea3453ba61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:40Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:40 crc kubenswrapper[4957]: I0123 10:52:40.906699 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fnxz6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c7b1449-2e9b-4c07-a531-591cb968f511\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6410b47b5b38b4ce50175e8cd9c2cc7ca241b914d1dba4accf3a1deb3e066ccd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj687\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:49Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fnxz6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:40Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:40 crc kubenswrapper[4957]: I0123 10:52:40.919325 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9f19c40d295c11e3a1170d61fd738b1dcd8fb10087f6a1bb74e6e6c8e6cfb56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6fcda9eaf99f7b60db85da6f18a98ccca7b5bc532aa28388fc7845caf1a7356\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:40Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:40 crc kubenswrapper[4957]: I0123 10:52:40.931122 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:40Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:40 crc kubenswrapper[4957]: I0123 10:52:40.943159 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6cq2v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11d94cd0-1619-4ef6-952a-aef84e1cdc75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdb19fbdef461009ebd78d9089ba9c94908e4c9fbcab108320e0d89c7f30547f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d28aff8ae7d4adbef5753dc21af0f1055f19ed4404bf9af31f34be215120555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d28aff8ae7d4adbef5753dc21af0f1055f19ed4404bf9af31f34be215120555\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90e2094361e033ef7c8226913e9e396e6a2036617edb9c5ad0ed00c2c5f9f707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90e2094361e033ef7c8226913e9e396e6a2036617edb9c5ad0ed00c2c5f9f707\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e945a247fdc5833508edb04c5130a406d137e3b0ab1be12f9bf954f2621c3bf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e945a247fdc5833508edb04c5130a406d137e3b0ab1be12f9bf954f2621c3bf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26ffeff282bb20d68e9f648999f28b818a67b74f062714fd26ad22e70591a74c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26ffeff282bb20d68e9f648999f28b818a67b74f062714fd26ad22e70591a74c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bbe74c2c27b97766cfa5d26987cd0fbd8289871e66eab78c54a52c0aa039728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bbe74c2c27b97766cfa5d26987cd0fbd8289871e66eab78c54a52c0aa039728\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9ee5b55e77324735662dd6bc0fdeee86af454eb4b9e8eb9e877119f7c1395ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9ee5b55e77324735662dd6bc0fdeee86af454eb4b9e8eb9e877119f7c1395ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6cq2v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:40Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:40 crc kubenswrapper[4957]: I0123 10:52:40.958323 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5fxfb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"87775b38-0664-48f6-8857-7568c135bd79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrj7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrj7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:52:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5fxfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:40Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:40 crc kubenswrapper[4957]: I0123 10:52:40.972734 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea507738-b425-4366-808b-3a47317e66d0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cdf71b1a8491d3a4853fde19a5b1af1eb4697cbf07de482e22a52704ba0470f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52eea4c3c7c3b8898e64dd0eb05c1883ea1c2fa94e7e606f3ab48bbf5aaee8d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f405b6b517d30a201b793965bd82536f496d62b89562cefc7e3a9d9f7829633\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b6915f908509c8609290327ffc2dccf0e5680dc227979285a7ebaca4643cb7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e837e02e63dbe59e7920302c0fb0b5c9165e96ebb684adadb02bacd61633214\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"le observer\\\\nW0123 10:51:48.273886 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0123 10:51:48.273997 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0123 10:51:48.275269 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3502356273/tls.crt::/tmp/serving-cert-3502356273/tls.key\\\\\\\"\\\\nI0123 10:51:48.537137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0123 10:51:48.548481 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0123 10:51:48.548515 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0123 10:51:48.548544 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0123 10:51:48.548577 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0123 10:51:48.561057 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0123 10:51:48.561112 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0123 10:51:48.561123 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0123 10:51:48.561139 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0123 10:51:48.561151 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0123 10:51:48.561158 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0123 10:51:48.561167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0123 10:51:48.561209 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0123 10:51:48.561756 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da77583099215643577c5d064d67ce2cca9d0b74e7ba7c88f3a948a8516fd66c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd61e9841e88c0a34764491cfafe3e4f94233e4ac0543a3e6ed1326dd8d7280d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd61e9841e88c0a34764491cfafe3e4f94233e4ac0543a3e6ed1326dd8d7280d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:40Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:40 crc kubenswrapper[4957]: I0123 10:52:40.974706 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:40 crc kubenswrapper[4957]: I0123 10:52:40.974746 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:40 crc kubenswrapper[4957]: I0123 10:52:40.974755 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:40 crc kubenswrapper[4957]: I0123 10:52:40.974773 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:40 crc kubenswrapper[4957]: I0123 10:52:40.974786 4957 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:40Z","lastTransitionTime":"2026-01-23T10:52:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:40 crc kubenswrapper[4957]: I0123 10:52:40.985389 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49d5bf3b-7b38-431d-abdd-266da7d33d54\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dea2d00e0b56b1da716e6188c5d0c1cbb52bcdf2a9483168aabc8f3bc408b7b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e85d135cfb611a98270feed36b0cb6f5992ca1432d5d1af0a62465e71ddd0244\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95ec9dd4bf806f7dd14d9e8b14fb6ccd83a8f5b1226a4cb365a946f4c6f8adad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controlle
r\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://646a6c52a5b43295fea795b62f3903a351b07e95dc45af842bbe3c3218e143ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://646a6c52a5b43295fea795b62f3903a351b07e95dc45af842bbe3c3218e143ff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:40Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:40 crc kubenswrapper[4957]: I0123 10:52:40.998439 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:40Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:41 crc kubenswrapper[4957]: I0123 10:52:41.077713 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:41 crc kubenswrapper[4957]: I0123 10:52:41.077752 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:41 crc kubenswrapper[4957]: I0123 10:52:41.077761 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:41 crc kubenswrapper[4957]: I0123 10:52:41.077776 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:41 crc kubenswrapper[4957]: I0123 10:52:41.077788 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:41Z","lastTransitionTime":"2026-01-23T10:52:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:41 crc kubenswrapper[4957]: I0123 10:52:41.180059 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:41 crc kubenswrapper[4957]: I0123 10:52:41.180162 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:41 crc kubenswrapper[4957]: I0123 10:52:41.180177 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:41 crc kubenswrapper[4957]: I0123 10:52:41.180195 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:41 crc kubenswrapper[4957]: I0123 10:52:41.180527 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:41Z","lastTransitionTime":"2026-01-23T10:52:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:41 crc kubenswrapper[4957]: I0123 10:52:41.282645 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:41 crc kubenswrapper[4957]: I0123 10:52:41.282704 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:41 crc kubenswrapper[4957]: I0123 10:52:41.282721 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:41 crc kubenswrapper[4957]: I0123 10:52:41.282748 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:41 crc kubenswrapper[4957]: I0123 10:52:41.282765 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:41Z","lastTransitionTime":"2026-01-23T10:52:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:41 crc kubenswrapper[4957]: I0123 10:52:41.385808 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:41 crc kubenswrapper[4957]: I0123 10:52:41.385843 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:41 crc kubenswrapper[4957]: I0123 10:52:41.385852 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:41 crc kubenswrapper[4957]: I0123 10:52:41.385866 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:41 crc kubenswrapper[4957]: I0123 10:52:41.385875 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:41Z","lastTransitionTime":"2026-01-23T10:52:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:41 crc kubenswrapper[4957]: I0123 10:52:41.488142 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:41 crc kubenswrapper[4957]: I0123 10:52:41.488183 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:41 crc kubenswrapper[4957]: I0123 10:52:41.488225 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:41 crc kubenswrapper[4957]: I0123 10:52:41.488260 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:41 crc kubenswrapper[4957]: I0123 10:52:41.488272 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:41Z","lastTransitionTime":"2026-01-23T10:52:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:41 crc kubenswrapper[4957]: I0123 10:52:41.590938 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:41 crc kubenswrapper[4957]: I0123 10:52:41.590980 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:41 crc kubenswrapper[4957]: I0123 10:52:41.590990 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:41 crc kubenswrapper[4957]: I0123 10:52:41.591009 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:41 crc kubenswrapper[4957]: I0123 10:52:41.591020 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:41Z","lastTransitionTime":"2026-01-23T10:52:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:41 crc kubenswrapper[4957]: I0123 10:52:41.693526 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:41 crc kubenswrapper[4957]: I0123 10:52:41.693572 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:41 crc kubenswrapper[4957]: I0123 10:52:41.693585 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:41 crc kubenswrapper[4957]: I0123 10:52:41.693604 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:41 crc kubenswrapper[4957]: I0123 10:52:41.693617 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:41Z","lastTransitionTime":"2026-01-23T10:52:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:41 crc kubenswrapper[4957]: I0123 10:52:41.758675 4957 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-15 22:54:22.044833609 +0000 UTC Jan 23 10:52:41 crc kubenswrapper[4957]: I0123 10:52:41.780082 4957 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Jan 23 10:52:41 crc kubenswrapper[4957]: I0123 10:52:41.796245 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:41 crc kubenswrapper[4957]: I0123 10:52:41.796333 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:41 crc kubenswrapper[4957]: I0123 10:52:41.796369 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:41 crc kubenswrapper[4957]: I0123 10:52:41.796404 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:41 crc kubenswrapper[4957]: I0123 10:52:41.796459 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:41Z","lastTransitionTime":"2026-01-23T10:52:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:41 crc kubenswrapper[4957]: I0123 10:52:41.899820 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:41 crc kubenswrapper[4957]: I0123 10:52:41.899849 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:41 crc kubenswrapper[4957]: I0123 10:52:41.899859 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:41 crc kubenswrapper[4957]: I0123 10:52:41.899874 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:41 crc kubenswrapper[4957]: I0123 10:52:41.899885 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:41Z","lastTransitionTime":"2026-01-23T10:52:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:42 crc kubenswrapper[4957]: I0123 10:52:42.003625 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:42 crc kubenswrapper[4957]: I0123 10:52:42.003664 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:42 crc kubenswrapper[4957]: I0123 10:52:42.003677 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:42 crc kubenswrapper[4957]: I0123 10:52:42.003697 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:42 crc kubenswrapper[4957]: I0123 10:52:42.003710 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:42Z","lastTransitionTime":"2026-01-23T10:52:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:42 crc kubenswrapper[4957]: I0123 10:52:42.106305 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:42 crc kubenswrapper[4957]: I0123 10:52:42.106365 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:42 crc kubenswrapper[4957]: I0123 10:52:42.106376 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:42 crc kubenswrapper[4957]: I0123 10:52:42.106392 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:42 crc kubenswrapper[4957]: I0123 10:52:42.106402 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:42Z","lastTransitionTime":"2026-01-23T10:52:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:42 crc kubenswrapper[4957]: I0123 10:52:42.208347 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:42 crc kubenswrapper[4957]: I0123 10:52:42.208390 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:42 crc kubenswrapper[4957]: I0123 10:52:42.208403 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:42 crc kubenswrapper[4957]: I0123 10:52:42.208420 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:42 crc kubenswrapper[4957]: I0123 10:52:42.208433 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:42Z","lastTransitionTime":"2026-01-23T10:52:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:42 crc kubenswrapper[4957]: I0123 10:52:42.310660 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:42 crc kubenswrapper[4957]: I0123 10:52:42.310700 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:42 crc kubenswrapper[4957]: I0123 10:52:42.310710 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:42 crc kubenswrapper[4957]: I0123 10:52:42.310726 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:42 crc kubenswrapper[4957]: I0123 10:52:42.310738 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:42Z","lastTransitionTime":"2026-01-23T10:52:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:42 crc kubenswrapper[4957]: I0123 10:52:42.413060 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:42 crc kubenswrapper[4957]: I0123 10:52:42.413110 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:42 crc kubenswrapper[4957]: I0123 10:52:42.413121 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:42 crc kubenswrapper[4957]: I0123 10:52:42.413143 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:42 crc kubenswrapper[4957]: I0123 10:52:42.413155 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:42Z","lastTransitionTime":"2026-01-23T10:52:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:42 crc kubenswrapper[4957]: I0123 10:52:42.516733 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:42 crc kubenswrapper[4957]: I0123 10:52:42.516766 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:42 crc kubenswrapper[4957]: I0123 10:52:42.516777 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:42 crc kubenswrapper[4957]: I0123 10:52:42.516795 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:42 crc kubenswrapper[4957]: I0123 10:52:42.516807 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:42Z","lastTransitionTime":"2026-01-23T10:52:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:42 crc kubenswrapper[4957]: I0123 10:52:42.619512 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:42 crc kubenswrapper[4957]: I0123 10:52:42.619612 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:42 crc kubenswrapper[4957]: I0123 10:52:42.619632 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:42 crc kubenswrapper[4957]: I0123 10:52:42.619666 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:42 crc kubenswrapper[4957]: I0123 10:52:42.619683 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:42Z","lastTransitionTime":"2026-01-23T10:52:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:42 crc kubenswrapper[4957]: I0123 10:52:42.722350 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:42 crc kubenswrapper[4957]: I0123 10:52:42.722394 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:42 crc kubenswrapper[4957]: I0123 10:52:42.722404 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:42 crc kubenswrapper[4957]: I0123 10:52:42.722425 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:42 crc kubenswrapper[4957]: I0123 10:52:42.722437 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:42Z","lastTransitionTime":"2026-01-23T10:52:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:42 crc kubenswrapper[4957]: I0123 10:52:42.758875 4957 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-09 04:01:32.421720674 +0000 UTC Jan 23 10:52:42 crc kubenswrapper[4957]: I0123 10:52:42.769416 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 23 10:52:42 crc kubenswrapper[4957]: I0123 10:52:42.769460 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5fxfb" Jan 23 10:52:42 crc kubenswrapper[4957]: I0123 10:52:42.769416 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 23 10:52:42 crc kubenswrapper[4957]: I0123 10:52:42.769522 4957 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 23 10:52:42 crc kubenswrapper[4957]: E0123 10:52:42.769672 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 23 10:52:42 crc kubenswrapper[4957]: E0123 10:52:42.769767 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 23 10:52:42 crc kubenswrapper[4957]: E0123 10:52:42.769857 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5fxfb" podUID="87775b38-0664-48f6-8857-7568c135bd79" Jan 23 10:52:42 crc kubenswrapper[4957]: E0123 10:52:42.769903 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 23 10:52:42 crc kubenswrapper[4957]: I0123 10:52:42.825708 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:42 crc kubenswrapper[4957]: I0123 10:52:42.825763 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:42 crc kubenswrapper[4957]: I0123 10:52:42.825774 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:42 crc kubenswrapper[4957]: I0123 10:52:42.825794 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:42 crc kubenswrapper[4957]: I0123 10:52:42.825815 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:42Z","lastTransitionTime":"2026-01-23T10:52:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:42 crc kubenswrapper[4957]: I0123 10:52:42.950231 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:42 crc kubenswrapper[4957]: I0123 10:52:42.950266 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:42 crc kubenswrapper[4957]: I0123 10:52:42.950290 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:42 crc kubenswrapper[4957]: I0123 10:52:42.950309 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:42 crc kubenswrapper[4957]: I0123 10:52:42.950320 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:42Z","lastTransitionTime":"2026-01-23T10:52:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:43 crc kubenswrapper[4957]: I0123 10:52:43.052522 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:43 crc kubenswrapper[4957]: I0123 10:52:43.052558 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:43 crc kubenswrapper[4957]: I0123 10:52:43.052570 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:43 crc kubenswrapper[4957]: I0123 10:52:43.052584 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:43 crc kubenswrapper[4957]: I0123 10:52:43.052597 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:43Z","lastTransitionTime":"2026-01-23T10:52:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:43 crc kubenswrapper[4957]: I0123 10:52:43.155683 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:43 crc kubenswrapper[4957]: I0123 10:52:43.155755 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:43 crc kubenswrapper[4957]: I0123 10:52:43.155777 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:43 crc kubenswrapper[4957]: I0123 10:52:43.155808 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:43 crc kubenswrapper[4957]: I0123 10:52:43.155832 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:43Z","lastTransitionTime":"2026-01-23T10:52:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:43 crc kubenswrapper[4957]: I0123 10:52:43.258834 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:43 crc kubenswrapper[4957]: I0123 10:52:43.258875 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:43 crc kubenswrapper[4957]: I0123 10:52:43.258884 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:43 crc kubenswrapper[4957]: I0123 10:52:43.258901 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:43 crc kubenswrapper[4957]: I0123 10:52:43.258911 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:43Z","lastTransitionTime":"2026-01-23T10:52:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:43 crc kubenswrapper[4957]: I0123 10:52:43.358131 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:43 crc kubenswrapper[4957]: I0123 10:52:43.358208 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:43 crc kubenswrapper[4957]: I0123 10:52:43.358216 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:43 crc kubenswrapper[4957]: I0123 10:52:43.358231 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:43 crc kubenswrapper[4957]: I0123 10:52:43.358243 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:43Z","lastTransitionTime":"2026-01-23T10:52:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:43 crc kubenswrapper[4957]: E0123 10:52:43.376378 4957 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"608e000a-3057-4f1e-b4ab-15bf3bfea3b8\\\",\\\"systemUUID\\\":\\\"4219e85c-09d5-42d3-a5cb-7c9fe3da136f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:43Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:43 crc kubenswrapper[4957]: I0123 10:52:43.380811 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:43 crc kubenswrapper[4957]: I0123 10:52:43.380886 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 23 10:52:43 crc kubenswrapper[4957]: I0123 10:52:43.380907 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:43 crc kubenswrapper[4957]: I0123 10:52:43.380937 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:43 crc kubenswrapper[4957]: I0123 10:52:43.380957 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:43Z","lastTransitionTime":"2026-01-23T10:52:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:43 crc kubenswrapper[4957]: E0123 10:52:43.393162 4957 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"608e000a-3057-4f1e-b4ab-15bf3bfea3b8\\\",\\\"systemUUID\\\":\\\"4219e85c-09d5-42d3-a5cb-7c9fe3da136f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:43Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:43 crc kubenswrapper[4957]: I0123 10:52:43.397359 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:43 crc kubenswrapper[4957]: I0123 10:52:43.397392 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 23 10:52:43 crc kubenswrapper[4957]: I0123 10:52:43.397421 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:43 crc kubenswrapper[4957]: I0123 10:52:43.397435 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:43 crc kubenswrapper[4957]: I0123 10:52:43.397444 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:43Z","lastTransitionTime":"2026-01-23T10:52:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:43 crc kubenswrapper[4957]: E0123 10:52:43.410270 4957 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"608e000a-3057-4f1e-b4ab-15bf3bfea3b8\\\",\\\"systemUUID\\\":\\\"4219e85c-09d5-42d3-a5cb-7c9fe3da136f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:43Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:43 crc kubenswrapper[4957]: I0123 10:52:43.413982 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:43 crc kubenswrapper[4957]: I0123 10:52:43.414027 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 23 10:52:43 crc kubenswrapper[4957]: I0123 10:52:43.414038 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:43 crc kubenswrapper[4957]: I0123 10:52:43.414058 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:43 crc kubenswrapper[4957]: I0123 10:52:43.414070 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:43Z","lastTransitionTime":"2026-01-23T10:52:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:43 crc kubenswrapper[4957]: E0123 10:52:43.425850 4957 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"608e000a-3057-4f1e-b4ab-15bf3bfea3b8\\\",\\\"systemUUID\\\":\\\"4219e85c-09d5-42d3-a5cb-7c9fe3da136f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:43Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:43 crc kubenswrapper[4957]: I0123 10:52:43.429703 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:43 crc kubenswrapper[4957]: I0123 10:52:43.429746 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 23 10:52:43 crc kubenswrapper[4957]: I0123 10:52:43.429755 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:43 crc kubenswrapper[4957]: I0123 10:52:43.429772 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:43 crc kubenswrapper[4957]: I0123 10:52:43.429784 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:43Z","lastTransitionTime":"2026-01-23T10:52:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:43 crc kubenswrapper[4957]: E0123 10:52:43.440837 4957 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"608e000a-3057-4f1e-b4ab-15bf3bfea3b8\\\",\\\"systemUUID\\\":\\\"4219e85c-09d5-42d3-a5cb-7c9fe3da136f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:43Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:43 crc kubenswrapper[4957]: E0123 10:52:43.441008 4957 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 23 10:52:43 crc kubenswrapper[4957]: I0123 10:52:43.442724 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Jan 23 10:52:43 crc kubenswrapper[4957]: I0123 10:52:43.442750 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:43 crc kubenswrapper[4957]: I0123 10:52:43.442761 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:43 crc kubenswrapper[4957]: I0123 10:52:43.442778 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:43 crc kubenswrapper[4957]: I0123 10:52:43.442789 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:43Z","lastTransitionTime":"2026-01-23T10:52:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:43 crc kubenswrapper[4957]: I0123 10:52:43.546411 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:43 crc kubenswrapper[4957]: I0123 10:52:43.546453 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:43 crc kubenswrapper[4957]: I0123 10:52:43.546469 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:43 crc kubenswrapper[4957]: I0123 10:52:43.546486 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:43 crc kubenswrapper[4957]: I0123 10:52:43.546499 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:43Z","lastTransitionTime":"2026-01-23T10:52:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:43 crc kubenswrapper[4957]: I0123 10:52:43.649784 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:43 crc kubenswrapper[4957]: I0123 10:52:43.649874 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:43 crc kubenswrapper[4957]: I0123 10:52:43.649899 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:43 crc kubenswrapper[4957]: I0123 10:52:43.649932 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:43 crc kubenswrapper[4957]: I0123 10:52:43.649956 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:43Z","lastTransitionTime":"2026-01-23T10:52:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:43 crc kubenswrapper[4957]: I0123 10:52:43.754049 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:43 crc kubenswrapper[4957]: I0123 10:52:43.754141 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:43 crc kubenswrapper[4957]: I0123 10:52:43.754158 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:43 crc kubenswrapper[4957]: I0123 10:52:43.754188 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:43 crc kubenswrapper[4957]: I0123 10:52:43.754212 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:43Z","lastTransitionTime":"2026-01-23T10:52:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:43 crc kubenswrapper[4957]: I0123 10:52:43.759274 4957 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-23 03:00:34.273490078 +0000 UTC Jan 23 10:52:43 crc kubenswrapper[4957]: I0123 10:52:43.857585 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:43 crc kubenswrapper[4957]: I0123 10:52:43.857628 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:43 crc kubenswrapper[4957]: I0123 10:52:43.857641 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:43 crc kubenswrapper[4957]: I0123 10:52:43.857659 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:43 crc kubenswrapper[4957]: I0123 10:52:43.857671 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:43Z","lastTransitionTime":"2026-01-23T10:52:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:43 crc kubenswrapper[4957]: I0123 10:52:43.960598 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:43 crc kubenswrapper[4957]: I0123 10:52:43.960638 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:43 crc kubenswrapper[4957]: I0123 10:52:43.960649 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:43 crc kubenswrapper[4957]: I0123 10:52:43.960666 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:43 crc kubenswrapper[4957]: I0123 10:52:43.960677 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:43Z","lastTransitionTime":"2026-01-23T10:52:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:44 crc kubenswrapper[4957]: I0123 10:52:44.062735 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:44 crc kubenswrapper[4957]: I0123 10:52:44.062770 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:44 crc kubenswrapper[4957]: I0123 10:52:44.062778 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:44 crc kubenswrapper[4957]: I0123 10:52:44.062791 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:44 crc kubenswrapper[4957]: I0123 10:52:44.062800 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:44Z","lastTransitionTime":"2026-01-23T10:52:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:44 crc kubenswrapper[4957]: I0123 10:52:44.166049 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:44 crc kubenswrapper[4957]: I0123 10:52:44.166121 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:44 crc kubenswrapper[4957]: I0123 10:52:44.166145 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:44 crc kubenswrapper[4957]: I0123 10:52:44.166175 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:44 crc kubenswrapper[4957]: I0123 10:52:44.166197 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:44Z","lastTransitionTime":"2026-01-23T10:52:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:44 crc kubenswrapper[4957]: I0123 10:52:44.269509 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:44 crc kubenswrapper[4957]: I0123 10:52:44.269586 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:44 crc kubenswrapper[4957]: I0123 10:52:44.269604 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:44 crc kubenswrapper[4957]: I0123 10:52:44.269632 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:44 crc kubenswrapper[4957]: I0123 10:52:44.269650 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:44Z","lastTransitionTime":"2026-01-23T10:52:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:44 crc kubenswrapper[4957]: I0123 10:52:44.373145 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:44 crc kubenswrapper[4957]: I0123 10:52:44.373210 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:44 crc kubenswrapper[4957]: I0123 10:52:44.373227 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:44 crc kubenswrapper[4957]: I0123 10:52:44.373251 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:44 crc kubenswrapper[4957]: I0123 10:52:44.373268 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:44Z","lastTransitionTime":"2026-01-23T10:52:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:44 crc kubenswrapper[4957]: I0123 10:52:44.475645 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:44 crc kubenswrapper[4957]: I0123 10:52:44.475705 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:44 crc kubenswrapper[4957]: I0123 10:52:44.475717 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:44 crc kubenswrapper[4957]: I0123 10:52:44.475734 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:44 crc kubenswrapper[4957]: I0123 10:52:44.475745 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:44Z","lastTransitionTime":"2026-01-23T10:52:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:44 crc kubenswrapper[4957]: I0123 10:52:44.577897 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:44 crc kubenswrapper[4957]: I0123 10:52:44.577963 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:44 crc kubenswrapper[4957]: I0123 10:52:44.577974 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:44 crc kubenswrapper[4957]: I0123 10:52:44.577990 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:44 crc kubenswrapper[4957]: I0123 10:52:44.578002 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:44Z","lastTransitionTime":"2026-01-23T10:52:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:44 crc kubenswrapper[4957]: I0123 10:52:44.680463 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:44 crc kubenswrapper[4957]: I0123 10:52:44.680525 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:44 crc kubenswrapper[4957]: I0123 10:52:44.680537 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:44 crc kubenswrapper[4957]: I0123 10:52:44.680558 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:44 crc kubenswrapper[4957]: I0123 10:52:44.680568 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:44Z","lastTransitionTime":"2026-01-23T10:52:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:44 crc kubenswrapper[4957]: I0123 10:52:44.759977 4957 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-27 01:20:49.024071985 +0000 UTC Jan 23 10:52:44 crc kubenswrapper[4957]: I0123 10:52:44.769383 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5fxfb" Jan 23 10:52:44 crc kubenswrapper[4957]: I0123 10:52:44.769461 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 23 10:52:44 crc kubenswrapper[4957]: I0123 10:52:44.769384 4957 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 23 10:52:44 crc kubenswrapper[4957]: E0123 10:52:44.769516 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5fxfb" podUID="87775b38-0664-48f6-8857-7568c135bd79" Jan 23 10:52:44 crc kubenswrapper[4957]: I0123 10:52:44.769549 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 23 10:52:44 crc kubenswrapper[4957]: E0123 10:52:44.769797 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 23 10:52:44 crc kubenswrapper[4957]: E0123 10:52:44.769915 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 23 10:52:44 crc kubenswrapper[4957]: E0123 10:52:44.769978 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 23 10:52:44 crc kubenswrapper[4957]: I0123 10:52:44.783201 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:44 crc kubenswrapper[4957]: I0123 10:52:44.783337 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:44 crc kubenswrapper[4957]: I0123 10:52:44.783364 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:44 crc kubenswrapper[4957]: I0123 10:52:44.783468 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:44 crc kubenswrapper[4957]: I0123 10:52:44.783527 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:44Z","lastTransitionTime":"2026-01-23T10:52:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:44 crc kubenswrapper[4957]: I0123 10:52:44.886479 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:44 crc kubenswrapper[4957]: I0123 10:52:44.886528 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:44 crc kubenswrapper[4957]: I0123 10:52:44.886544 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:44 crc kubenswrapper[4957]: I0123 10:52:44.886570 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:44 crc kubenswrapper[4957]: I0123 10:52:44.886587 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:44Z","lastTransitionTime":"2026-01-23T10:52:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:44 crc kubenswrapper[4957]: I0123 10:52:44.989069 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:44 crc kubenswrapper[4957]: I0123 10:52:44.989120 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:44 crc kubenswrapper[4957]: I0123 10:52:44.989137 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:44 crc kubenswrapper[4957]: I0123 10:52:44.989160 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:44 crc kubenswrapper[4957]: I0123 10:52:44.989179 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:44Z","lastTransitionTime":"2026-01-23T10:52:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:45 crc kubenswrapper[4957]: I0123 10:52:45.092273 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:45 crc kubenswrapper[4957]: I0123 10:52:45.092345 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:45 crc kubenswrapper[4957]: I0123 10:52:45.092362 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:45 crc kubenswrapper[4957]: I0123 10:52:45.092386 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:45 crc kubenswrapper[4957]: I0123 10:52:45.092402 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:45Z","lastTransitionTime":"2026-01-23T10:52:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:45 crc kubenswrapper[4957]: I0123 10:52:45.195958 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:45 crc kubenswrapper[4957]: I0123 10:52:45.196020 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:45 crc kubenswrapper[4957]: I0123 10:52:45.196037 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:45 crc kubenswrapper[4957]: I0123 10:52:45.196063 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:45 crc kubenswrapper[4957]: I0123 10:52:45.196077 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:45Z","lastTransitionTime":"2026-01-23T10:52:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:45 crc kubenswrapper[4957]: I0123 10:52:45.299153 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:45 crc kubenswrapper[4957]: I0123 10:52:45.299216 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:45 crc kubenswrapper[4957]: I0123 10:52:45.299239 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:45 crc kubenswrapper[4957]: I0123 10:52:45.299270 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:45 crc kubenswrapper[4957]: I0123 10:52:45.299344 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:45Z","lastTransitionTime":"2026-01-23T10:52:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:45 crc kubenswrapper[4957]: I0123 10:52:45.402224 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:45 crc kubenswrapper[4957]: I0123 10:52:45.402300 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:45 crc kubenswrapper[4957]: I0123 10:52:45.402313 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:45 crc kubenswrapper[4957]: I0123 10:52:45.402332 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:45 crc kubenswrapper[4957]: I0123 10:52:45.402344 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:45Z","lastTransitionTime":"2026-01-23T10:52:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:45 crc kubenswrapper[4957]: I0123 10:52:45.504899 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:45 crc kubenswrapper[4957]: I0123 10:52:45.504963 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:45 crc kubenswrapper[4957]: I0123 10:52:45.504984 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:45 crc kubenswrapper[4957]: I0123 10:52:45.505011 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:45 crc kubenswrapper[4957]: I0123 10:52:45.505034 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:45Z","lastTransitionTime":"2026-01-23T10:52:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:45 crc kubenswrapper[4957]: I0123 10:52:45.608272 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:45 crc kubenswrapper[4957]: I0123 10:52:45.608361 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:45 crc kubenswrapper[4957]: I0123 10:52:45.608378 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:45 crc kubenswrapper[4957]: I0123 10:52:45.608407 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:45 crc kubenswrapper[4957]: I0123 10:52:45.608425 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:45Z","lastTransitionTime":"2026-01-23T10:52:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:45 crc kubenswrapper[4957]: I0123 10:52:45.711680 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:45 crc kubenswrapper[4957]: I0123 10:52:45.711756 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:45 crc kubenswrapper[4957]: I0123 10:52:45.711779 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:45 crc kubenswrapper[4957]: I0123 10:52:45.711812 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:45 crc kubenswrapper[4957]: I0123 10:52:45.711838 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:45Z","lastTransitionTime":"2026-01-23T10:52:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:45 crc kubenswrapper[4957]: I0123 10:52:45.760861 4957 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-25 07:31:22.015303118 +0000 UTC Jan 23 10:52:45 crc kubenswrapper[4957]: I0123 10:52:45.814501 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:45 crc kubenswrapper[4957]: I0123 10:52:45.814569 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:45 crc kubenswrapper[4957]: I0123 10:52:45.814595 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:45 crc kubenswrapper[4957]: I0123 10:52:45.814622 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:45 crc kubenswrapper[4957]: I0123 10:52:45.814640 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:45Z","lastTransitionTime":"2026-01-23T10:52:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:45 crc kubenswrapper[4957]: I0123 10:52:45.917954 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:45 crc kubenswrapper[4957]: I0123 10:52:45.918026 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:45 crc kubenswrapper[4957]: I0123 10:52:45.918048 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:45 crc kubenswrapper[4957]: I0123 10:52:45.918076 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:45 crc kubenswrapper[4957]: I0123 10:52:45.918098 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:45Z","lastTransitionTime":"2026-01-23T10:52:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:46 crc kubenswrapper[4957]: I0123 10:52:46.021310 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:46 crc kubenswrapper[4957]: I0123 10:52:46.021361 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:46 crc kubenswrapper[4957]: I0123 10:52:46.021378 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:46 crc kubenswrapper[4957]: I0123 10:52:46.021402 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:46 crc kubenswrapper[4957]: I0123 10:52:46.021419 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:46Z","lastTransitionTime":"2026-01-23T10:52:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:46 crc kubenswrapper[4957]: I0123 10:52:46.124105 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:46 crc kubenswrapper[4957]: I0123 10:52:46.124178 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:46 crc kubenswrapper[4957]: I0123 10:52:46.124190 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:46 crc kubenswrapper[4957]: I0123 10:52:46.124215 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:46 crc kubenswrapper[4957]: I0123 10:52:46.124230 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:46Z","lastTransitionTime":"2026-01-23T10:52:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:46 crc kubenswrapper[4957]: I0123 10:52:46.227406 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:46 crc kubenswrapper[4957]: I0123 10:52:46.227466 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:46 crc kubenswrapper[4957]: I0123 10:52:46.227484 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:46 crc kubenswrapper[4957]: I0123 10:52:46.227511 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:46 crc kubenswrapper[4957]: I0123 10:52:46.227532 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:46Z","lastTransitionTime":"2026-01-23T10:52:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:46 crc kubenswrapper[4957]: I0123 10:52:46.331571 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:46 crc kubenswrapper[4957]: I0123 10:52:46.331619 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:46 crc kubenswrapper[4957]: I0123 10:52:46.331636 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:46 crc kubenswrapper[4957]: I0123 10:52:46.331658 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:46 crc kubenswrapper[4957]: I0123 10:52:46.331677 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:46Z","lastTransitionTime":"2026-01-23T10:52:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:46 crc kubenswrapper[4957]: I0123 10:52:46.435108 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:46 crc kubenswrapper[4957]: I0123 10:52:46.435171 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:46 crc kubenswrapper[4957]: I0123 10:52:46.435191 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:46 crc kubenswrapper[4957]: I0123 10:52:46.435215 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:46 crc kubenswrapper[4957]: I0123 10:52:46.435232 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:46Z","lastTransitionTime":"2026-01-23T10:52:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:46 crc kubenswrapper[4957]: I0123 10:52:46.538741 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:46 crc kubenswrapper[4957]: I0123 10:52:46.538788 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:46 crc kubenswrapper[4957]: I0123 10:52:46.538807 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:46 crc kubenswrapper[4957]: I0123 10:52:46.538823 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:46 crc kubenswrapper[4957]: I0123 10:52:46.538834 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:46Z","lastTransitionTime":"2026-01-23T10:52:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:46 crc kubenswrapper[4957]: I0123 10:52:46.640894 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:46 crc kubenswrapper[4957]: I0123 10:52:46.640932 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:46 crc kubenswrapper[4957]: I0123 10:52:46.640942 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:46 crc kubenswrapper[4957]: I0123 10:52:46.640958 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:46 crc kubenswrapper[4957]: I0123 10:52:46.640968 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:46Z","lastTransitionTime":"2026-01-23T10:52:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:46 crc kubenswrapper[4957]: I0123 10:52:46.743793 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:46 crc kubenswrapper[4957]: I0123 10:52:46.743841 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:46 crc kubenswrapper[4957]: I0123 10:52:46.743851 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:46 crc kubenswrapper[4957]: I0123 10:52:46.743876 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:46 crc kubenswrapper[4957]: I0123 10:52:46.743888 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:46Z","lastTransitionTime":"2026-01-23T10:52:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:46 crc kubenswrapper[4957]: I0123 10:52:46.761467 4957 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-07 15:45:05.287919958 +0000 UTC Jan 23 10:52:46 crc kubenswrapper[4957]: I0123 10:52:46.768881 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 23 10:52:46 crc kubenswrapper[4957]: I0123 10:52:46.768938 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5fxfb" Jan 23 10:52:46 crc kubenswrapper[4957]: I0123 10:52:46.768899 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 23 10:52:46 crc kubenswrapper[4957]: I0123 10:52:46.768877 4957 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 23 10:52:46 crc kubenswrapper[4957]: E0123 10:52:46.769014 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 23 10:52:46 crc kubenswrapper[4957]: E0123 10:52:46.769141 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 23 10:52:46 crc kubenswrapper[4957]: E0123 10:52:46.769388 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5fxfb" podUID="87775b38-0664-48f6-8857-7568c135bd79" Jan 23 10:52:46 crc kubenswrapper[4957]: E0123 10:52:46.769470 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 23 10:52:46 crc kubenswrapper[4957]: I0123 10:52:46.846693 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:46 crc kubenswrapper[4957]: I0123 10:52:46.846758 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:46 crc kubenswrapper[4957]: I0123 10:52:46.846768 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:46 crc kubenswrapper[4957]: I0123 10:52:46.846792 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:46 crc kubenswrapper[4957]: I0123 10:52:46.846805 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:46Z","lastTransitionTime":"2026-01-23T10:52:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:46 crc kubenswrapper[4957]: I0123 10:52:46.949538 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:46 crc kubenswrapper[4957]: I0123 10:52:46.949604 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:46 crc kubenswrapper[4957]: I0123 10:52:46.949628 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:46 crc kubenswrapper[4957]: I0123 10:52:46.949650 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:46 crc kubenswrapper[4957]: I0123 10:52:46.949667 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:46Z","lastTransitionTime":"2026-01-23T10:52:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:47 crc kubenswrapper[4957]: I0123 10:52:47.052552 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:47 crc kubenswrapper[4957]: I0123 10:52:47.052608 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:47 crc kubenswrapper[4957]: I0123 10:52:47.052625 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:47 crc kubenswrapper[4957]: I0123 10:52:47.052653 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:47 crc kubenswrapper[4957]: I0123 10:52:47.052671 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:47Z","lastTransitionTime":"2026-01-23T10:52:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:47 crc kubenswrapper[4957]: I0123 10:52:47.157546 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:47 crc kubenswrapper[4957]: I0123 10:52:47.157605 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:47 crc kubenswrapper[4957]: I0123 10:52:47.157621 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:47 crc kubenswrapper[4957]: I0123 10:52:47.157646 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:47 crc kubenswrapper[4957]: I0123 10:52:47.157665 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:47Z","lastTransitionTime":"2026-01-23T10:52:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:47 crc kubenswrapper[4957]: I0123 10:52:47.261478 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:47 crc kubenswrapper[4957]: I0123 10:52:47.261577 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:47 crc kubenswrapper[4957]: I0123 10:52:47.261601 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:47 crc kubenswrapper[4957]: I0123 10:52:47.261635 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:47 crc kubenswrapper[4957]: I0123 10:52:47.261659 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:47Z","lastTransitionTime":"2026-01-23T10:52:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:47 crc kubenswrapper[4957]: I0123 10:52:47.364506 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:47 crc kubenswrapper[4957]: I0123 10:52:47.365111 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:47 crc kubenswrapper[4957]: I0123 10:52:47.365338 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:47 crc kubenswrapper[4957]: I0123 10:52:47.365478 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:47 crc kubenswrapper[4957]: I0123 10:52:47.365611 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:47Z","lastTransitionTime":"2026-01-23T10:52:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:47 crc kubenswrapper[4957]: I0123 10:52:47.468039 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:47 crc kubenswrapper[4957]: I0123 10:52:47.468687 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:47 crc kubenswrapper[4957]: I0123 10:52:47.468795 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:47 crc kubenswrapper[4957]: I0123 10:52:47.468925 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:47 crc kubenswrapper[4957]: I0123 10:52:47.469009 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:47Z","lastTransitionTime":"2026-01-23T10:52:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:47 crc kubenswrapper[4957]: I0123 10:52:47.572908 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:47 crc kubenswrapper[4957]: I0123 10:52:47.572979 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:47 crc kubenswrapper[4957]: I0123 10:52:47.572998 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:47 crc kubenswrapper[4957]: I0123 10:52:47.573027 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:47 crc kubenswrapper[4957]: I0123 10:52:47.573047 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:47Z","lastTransitionTime":"2026-01-23T10:52:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:47 crc kubenswrapper[4957]: I0123 10:52:47.677727 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:47 crc kubenswrapper[4957]: I0123 10:52:47.677776 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:47 crc kubenswrapper[4957]: I0123 10:52:47.677788 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:47 crc kubenswrapper[4957]: I0123 10:52:47.677805 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:47 crc kubenswrapper[4957]: I0123 10:52:47.677816 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:47Z","lastTransitionTime":"2026-01-23T10:52:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:47 crc kubenswrapper[4957]: I0123 10:52:47.762532 4957 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 05:31:33.46271389 +0000 UTC Jan 23 10:52:47 crc kubenswrapper[4957]: I0123 10:52:47.781029 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:47 crc kubenswrapper[4957]: I0123 10:52:47.781074 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:47 crc kubenswrapper[4957]: I0123 10:52:47.781087 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:47 crc kubenswrapper[4957]: I0123 10:52:47.781112 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:47 crc kubenswrapper[4957]: I0123 10:52:47.781125 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:47Z","lastTransitionTime":"2026-01-23T10:52:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:47 crc kubenswrapper[4957]: I0123 10:52:47.884108 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:47 crc kubenswrapper[4957]: I0123 10:52:47.884164 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:47 crc kubenswrapper[4957]: I0123 10:52:47.884180 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:47 crc kubenswrapper[4957]: I0123 10:52:47.884203 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:47 crc kubenswrapper[4957]: I0123 10:52:47.884220 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:47Z","lastTransitionTime":"2026-01-23T10:52:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:47 crc kubenswrapper[4957]: I0123 10:52:47.988036 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:47 crc kubenswrapper[4957]: I0123 10:52:47.988099 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:47 crc kubenswrapper[4957]: I0123 10:52:47.988137 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:47 crc kubenswrapper[4957]: I0123 10:52:47.988180 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:47 crc kubenswrapper[4957]: I0123 10:52:47.988204 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:47Z","lastTransitionTime":"2026-01-23T10:52:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:48 crc kubenswrapper[4957]: I0123 10:52:48.091651 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:48 crc kubenswrapper[4957]: I0123 10:52:48.091722 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:48 crc kubenswrapper[4957]: I0123 10:52:48.091736 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:48 crc kubenswrapper[4957]: I0123 10:52:48.091760 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:48 crc kubenswrapper[4957]: I0123 10:52:48.091775 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:48Z","lastTransitionTime":"2026-01-23T10:52:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:48 crc kubenswrapper[4957]: I0123 10:52:48.194768 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:48 crc kubenswrapper[4957]: I0123 10:52:48.194818 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:48 crc kubenswrapper[4957]: I0123 10:52:48.194830 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:48 crc kubenswrapper[4957]: I0123 10:52:48.194846 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:48 crc kubenswrapper[4957]: I0123 10:52:48.194857 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:48Z","lastTransitionTime":"2026-01-23T10:52:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:48 crc kubenswrapper[4957]: I0123 10:52:48.298173 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:48 crc kubenswrapper[4957]: I0123 10:52:48.298243 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:48 crc kubenswrapper[4957]: I0123 10:52:48.298270 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:48 crc kubenswrapper[4957]: I0123 10:52:48.298339 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:48 crc kubenswrapper[4957]: I0123 10:52:48.298366 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:48Z","lastTransitionTime":"2026-01-23T10:52:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:48 crc kubenswrapper[4957]: I0123 10:52:48.401501 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:48 crc kubenswrapper[4957]: I0123 10:52:48.401805 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:48 crc kubenswrapper[4957]: I0123 10:52:48.401986 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:48 crc kubenswrapper[4957]: I0123 10:52:48.402151 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:48 crc kubenswrapper[4957]: I0123 10:52:48.402317 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:48Z","lastTransitionTime":"2026-01-23T10:52:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:48 crc kubenswrapper[4957]: I0123 10:52:48.511898 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:48 crc kubenswrapper[4957]: I0123 10:52:48.512022 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:48 crc kubenswrapper[4957]: I0123 10:52:48.512105 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:48 crc kubenswrapper[4957]: I0123 10:52:48.512133 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:48 crc kubenswrapper[4957]: I0123 10:52:48.512147 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:48Z","lastTransitionTime":"2026-01-23T10:52:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:48 crc kubenswrapper[4957]: I0123 10:52:48.616635 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:48 crc kubenswrapper[4957]: I0123 10:52:48.616725 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:48 crc kubenswrapper[4957]: I0123 10:52:48.616744 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:48 crc kubenswrapper[4957]: I0123 10:52:48.616774 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:48 crc kubenswrapper[4957]: I0123 10:52:48.616794 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:48Z","lastTransitionTime":"2026-01-23T10:52:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:48 crc kubenswrapper[4957]: I0123 10:52:48.719736 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:48 crc kubenswrapper[4957]: I0123 10:52:48.719825 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:48 crc kubenswrapper[4957]: I0123 10:52:48.719836 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:48 crc kubenswrapper[4957]: I0123 10:52:48.719856 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:48 crc kubenswrapper[4957]: I0123 10:52:48.719868 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:48Z","lastTransitionTime":"2026-01-23T10:52:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:48 crc kubenswrapper[4957]: I0123 10:52:48.763452 4957 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 04:48:14.666277335 +0000 UTC Jan 23 10:52:48 crc kubenswrapper[4957]: I0123 10:52:48.769901 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 23 10:52:48 crc kubenswrapper[4957]: I0123 10:52:48.770030 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 23 10:52:48 crc kubenswrapper[4957]: I0123 10:52:48.770107 4957 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-5fxfb" Jan 23 10:52:48 crc kubenswrapper[4957]: E0123 10:52:48.770096 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 23 10:52:48 crc kubenswrapper[4957]: I0123 10:52:48.770271 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 23 10:52:48 crc kubenswrapper[4957]: E0123 10:52:48.770422 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 23 10:52:48 crc kubenswrapper[4957]: E0123 10:52:48.770594 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 23 10:52:48 crc kubenswrapper[4957]: E0123 10:52:48.771015 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5fxfb" podUID="87775b38-0664-48f6-8857-7568c135bd79" Jan 23 10:52:48 crc kubenswrapper[4957]: I0123 10:52:48.823461 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:48 crc kubenswrapper[4957]: I0123 10:52:48.823920 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:48 crc kubenswrapper[4957]: I0123 10:52:48.824047 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:48 crc kubenswrapper[4957]: I0123 10:52:48.824154 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:48 crc kubenswrapper[4957]: I0123 10:52:48.824309 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:48Z","lastTransitionTime":"2026-01-23T10:52:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:48 crc kubenswrapper[4957]: I0123 10:52:48.927010 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:48 crc kubenswrapper[4957]: I0123 10:52:48.927907 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:48 crc kubenswrapper[4957]: I0123 10:52:48.927970 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:48 crc kubenswrapper[4957]: I0123 10:52:48.928002 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:48 crc kubenswrapper[4957]: I0123 10:52:48.928025 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:48Z","lastTransitionTime":"2026-01-23T10:52:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:49 crc kubenswrapper[4957]: I0123 10:52:49.031462 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:49 crc kubenswrapper[4957]: I0123 10:52:49.031540 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:49 crc kubenswrapper[4957]: I0123 10:52:49.031559 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:49 crc kubenswrapper[4957]: I0123 10:52:49.031591 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:49 crc kubenswrapper[4957]: I0123 10:52:49.031616 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:49Z","lastTransitionTime":"2026-01-23T10:52:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:49 crc kubenswrapper[4957]: I0123 10:52:49.135186 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:49 crc kubenswrapper[4957]: I0123 10:52:49.135773 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:49 crc kubenswrapper[4957]: I0123 10:52:49.135924 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:49 crc kubenswrapper[4957]: I0123 10:52:49.136078 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:49 crc kubenswrapper[4957]: I0123 10:52:49.136206 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:49Z","lastTransitionTime":"2026-01-23T10:52:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:49 crc kubenswrapper[4957]: I0123 10:52:49.240163 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:49 crc kubenswrapper[4957]: I0123 10:52:49.240215 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:49 crc kubenswrapper[4957]: I0123 10:52:49.240234 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:49 crc kubenswrapper[4957]: I0123 10:52:49.240260 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:49 crc kubenswrapper[4957]: I0123 10:52:49.240309 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:49Z","lastTransitionTime":"2026-01-23T10:52:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:49 crc kubenswrapper[4957]: I0123 10:52:49.343125 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:49 crc kubenswrapper[4957]: I0123 10:52:49.343193 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:49 crc kubenswrapper[4957]: I0123 10:52:49.343212 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:49 crc kubenswrapper[4957]: I0123 10:52:49.343273 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:49 crc kubenswrapper[4957]: I0123 10:52:49.343329 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:49Z","lastTransitionTime":"2026-01-23T10:52:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:49 crc kubenswrapper[4957]: I0123 10:52:49.446106 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:49 crc kubenswrapper[4957]: I0123 10:52:49.446460 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:49 crc kubenswrapper[4957]: I0123 10:52:49.446596 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:49 crc kubenswrapper[4957]: I0123 10:52:49.446705 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:49 crc kubenswrapper[4957]: I0123 10:52:49.446823 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:49Z","lastTransitionTime":"2026-01-23T10:52:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:49 crc kubenswrapper[4957]: I0123 10:52:49.550366 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:49 crc kubenswrapper[4957]: I0123 10:52:49.551306 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:49 crc kubenswrapper[4957]: I0123 10:52:49.551476 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:49 crc kubenswrapper[4957]: I0123 10:52:49.551577 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:49 crc kubenswrapper[4957]: I0123 10:52:49.551658 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:49Z","lastTransitionTime":"2026-01-23T10:52:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:49 crc kubenswrapper[4957]: I0123 10:52:49.655377 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:49 crc kubenswrapper[4957]: I0123 10:52:49.655918 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:49 crc kubenswrapper[4957]: I0123 10:52:49.656049 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:49 crc kubenswrapper[4957]: I0123 10:52:49.656160 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:49 crc kubenswrapper[4957]: I0123 10:52:49.656309 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:49Z","lastTransitionTime":"2026-01-23T10:52:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:49 crc kubenswrapper[4957]: I0123 10:52:49.759524 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:49 crc kubenswrapper[4957]: I0123 10:52:49.759602 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:49 crc kubenswrapper[4957]: I0123 10:52:49.759620 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:49 crc kubenswrapper[4957]: I0123 10:52:49.759644 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:49 crc kubenswrapper[4957]: I0123 10:52:49.759662 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:49Z","lastTransitionTime":"2026-01-23T10:52:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:49 crc kubenswrapper[4957]: I0123 10:52:49.763913 4957 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 04:51:55.477895463 +0000 UTC Jan 23 10:52:49 crc kubenswrapper[4957]: I0123 10:52:49.769972 4957 scope.go:117] "RemoveContainer" containerID="1b61126eca35a8279d32f7b9386c382f26da91f6b28007d881463f438155d2e3" Jan 23 10:52:49 crc kubenswrapper[4957]: I0123 10:52:49.862693 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:49 crc kubenswrapper[4957]: I0123 10:52:49.862761 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:49 crc kubenswrapper[4957]: I0123 10:52:49.862788 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:49 crc kubenswrapper[4957]: I0123 10:52:49.862820 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:49 crc kubenswrapper[4957]: I0123 10:52:49.862843 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:49Z","lastTransitionTime":"2026-01-23T10:52:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:49 crc kubenswrapper[4957]: I0123 10:52:49.966636 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:49 crc kubenswrapper[4957]: I0123 10:52:49.967128 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:49 crc kubenswrapper[4957]: I0123 10:52:49.967261 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:49 crc kubenswrapper[4957]: I0123 10:52:49.967430 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:49 crc kubenswrapper[4957]: I0123 10:52:49.967625 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:49Z","lastTransitionTime":"2026-01-23T10:52:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:50 crc kubenswrapper[4957]: I0123 10:52:50.070886 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:50 crc kubenswrapper[4957]: I0123 10:52:50.071335 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:50 crc kubenswrapper[4957]: I0123 10:52:50.071447 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:50 crc kubenswrapper[4957]: I0123 10:52:50.071621 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:50 crc kubenswrapper[4957]: I0123 10:52:50.071738 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:50Z","lastTransitionTime":"2026-01-23T10:52:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:50 crc kubenswrapper[4957]: I0123 10:52:50.174848 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:50 crc kubenswrapper[4957]: I0123 10:52:50.174911 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:50 crc kubenswrapper[4957]: I0123 10:52:50.174928 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:50 crc kubenswrapper[4957]: I0123 10:52:50.174953 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:50 crc kubenswrapper[4957]: I0123 10:52:50.174971 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:50Z","lastTransitionTime":"2026-01-23T10:52:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:50 crc kubenswrapper[4957]: I0123 10:52:50.277990 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:50 crc kubenswrapper[4957]: I0123 10:52:50.278051 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:50 crc kubenswrapper[4957]: I0123 10:52:50.278068 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:50 crc kubenswrapper[4957]: I0123 10:52:50.278091 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:50 crc kubenswrapper[4957]: I0123 10:52:50.278109 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:50Z","lastTransitionTime":"2026-01-23T10:52:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:50 crc kubenswrapper[4957]: I0123 10:52:50.381712 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:50 crc kubenswrapper[4957]: I0123 10:52:50.382088 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:50 crc kubenswrapper[4957]: I0123 10:52:50.382248 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:50 crc kubenswrapper[4957]: I0123 10:52:50.382471 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:50 crc kubenswrapper[4957]: I0123 10:52:50.382641 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:50Z","lastTransitionTime":"2026-01-23T10:52:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:50 crc kubenswrapper[4957]: I0123 10:52:50.485398 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:50 crc kubenswrapper[4957]: I0123 10:52:50.485708 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:50 crc kubenswrapper[4957]: I0123 10:52:50.485921 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:50 crc kubenswrapper[4957]: I0123 10:52:50.486151 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:50 crc kubenswrapper[4957]: I0123 10:52:50.486388 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:50Z","lastTransitionTime":"2026-01-23T10:52:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:50 crc kubenswrapper[4957]: I0123 10:52:50.589354 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:50 crc kubenswrapper[4957]: I0123 10:52:50.589404 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:50 crc kubenswrapper[4957]: I0123 10:52:50.589415 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:50 crc kubenswrapper[4957]: I0123 10:52:50.589436 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:50 crc kubenswrapper[4957]: I0123 10:52:50.589450 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:50Z","lastTransitionTime":"2026-01-23T10:52:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:50 crc kubenswrapper[4957]: I0123 10:52:50.692237 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:50 crc kubenswrapper[4957]: I0123 10:52:50.692694 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:50 crc kubenswrapper[4957]: I0123 10:52:50.692839 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:50 crc kubenswrapper[4957]: I0123 10:52:50.692980 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:50 crc kubenswrapper[4957]: I0123 10:52:50.693102 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:50Z","lastTransitionTime":"2026-01-23T10:52:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:50 crc kubenswrapper[4957]: I0123 10:52:50.764245 4957 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-11 11:39:28.085454088 +0000 UTC Jan 23 10:52:50 crc kubenswrapper[4957]: I0123 10:52:50.768914 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5fxfb" Jan 23 10:52:50 crc kubenswrapper[4957]: I0123 10:52:50.768966 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 23 10:52:50 crc kubenswrapper[4957]: E0123 10:52:50.769123 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5fxfb" podUID="87775b38-0664-48f6-8857-7568c135bd79" Jan 23 10:52:50 crc kubenswrapper[4957]: I0123 10:52:50.769144 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 23 10:52:50 crc kubenswrapper[4957]: E0123 10:52:50.769295 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 23 10:52:50 crc kubenswrapper[4957]: E0123 10:52:50.769602 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 23 10:52:50 crc kubenswrapper[4957]: I0123 10:52:50.769452 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 23 10:52:50 crc kubenswrapper[4957]: E0123 10:52:50.769778 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 23 10:52:50 crc kubenswrapper[4957]: I0123 10:52:50.787208 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb53662e-fe72-4c19-b3a6-f5b541e5afcb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://829cfdb541d2a7861316957f39b8b9f43ec6f9f4e309a491f4451b1f3c34a9a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d9f270c80ebedc7d79510e2f421e23789483dce954f5e1469469703660febf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o:/
/90926087d1bb350c991fa9425706fcc22e12eec003aba87b72758892aae9d3a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8144556693b41dc2f9121be49ceed161caf8db5eec797f086128a2016be8072\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:50Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:50 crc kubenswrapper[4957]: I0123 10:52:50.795452 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:50 crc kubenswrapper[4957]: I0123 10:52:50.795503 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:50 crc kubenswrapper[4957]: I0123 10:52:50.795520 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:50 crc kubenswrapper[4957]: I0123 10:52:50.795545 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:50 crc kubenswrapper[4957]: I0123 10:52:50.795563 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:50Z","lastTransitionTime":"2026-01-23T10:52:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:50 crc kubenswrapper[4957]: I0123 10:52:50.801930 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rg9hb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a6ddd9-627a-4faa-a4c4-096ea19af31d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f469de9a3c43ade33ae855757f1244dcd825827dea9633af7143c078b08d6d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wngq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rg9hb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:50Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:50 crc kubenswrapper[4957]: I0123 10:52:50.814972 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2xjv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"224e3211-1f68-4673-8975-7e71b1e513d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dd046581d049e9ca0071a010da143a9b28d271b533b9cdc1c94d19311be0320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s26mf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f355e8990ff693448c7b8df392b7b2caeb59d6fee6cf8d5d4200f8ce1b5e03ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s26mf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w2xjv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:50Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:50 crc kubenswrapper[4957]: I0123 10:52:50.831817 4957 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-tlz2g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"233fdd78-4010-4fe8-9068-ee47d8ff25d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a41f7e81b1359b374160b43aed747c1058d4a086980d803825ae41e507f3d77c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6851e0ec1040550b8c9edb1b85213d2c849e381fae6b0f09c9a7247bd9c5088\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-23T10:52:37Z\\\",\\\"message\\\":\\\"2026-01-23T10:51:52+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_cc2190a4-eddd-4623-837e-d09cf8000fdd\\\\n2026-01-23T10:51:52+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_cc2190a4-eddd-4623-837e-d09cf8000fdd to /host/opt/cni/bin/\\\\n2026-01-23T10:51:52Z [verbose] multus-daemon started\\\\n2026-01-23T10:51:52Z [verbose] Readiness Indicator file check\\\\n2026-01-23T10:52:37Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpwrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tlz2g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:50Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:50 crc kubenswrapper[4957]: I0123 10:52:50.854381 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87adc28a-89e3-4743-a9f2-098d4a9432d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a14cf2687aa7c7a4c43dffbc2ad99a41aef0e46719171f63c7f769ee2d54e4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca18567eec1b0cc34d911b28d9f3d670a061722086817f58236f6a0da557262\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f862c6f11fe904458a8ecde92079c0b4aa4a9cb4dfc6f2ca094a1d3142570d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",
\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0efa75cf10a812bc4de5b071048558eb5f48828f6fb3049f3820fe5e0b7e2b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1a90ac89ce8ac710e5f8cff26e69aff44a735ef8155a7e93324809904a33e01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26b4bdc4f2514902dc2c95df59af4c954a5c1905821f5981e9437ff54d6d544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-s
ocket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b61126eca35a8279d32f7b9386c382f26da91f6b28007d881463f438155d2e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b61126eca35a8279d32f7b9386c382f26da91f6b28007d881463f438155d2e3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-23T10:52:21Z\\\",\\\"message\\\":\\\"e configs for network=default: []services.lbConfig(nil)\\\\nI0123 10:52:21.654230 6620 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-service-ca-operator/metrics]} name:Service_openshift-service-ca-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.40:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {2a3fb1a3-a476-4e14-bcf5-fb79af60206a}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0123 10:52:21.654121 6620 ovnkube_controller.go:900] Cache entry expected pod with UID \\\\\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\\\\\" but failed to find it\\\\nF0123 10:52:21.652450 6620 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controlle\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-23T10:52:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-z8hcw_openshift-ovn-kubernetes(87adc28a-89e3-4743-a9f2-098d4a9432d8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1be5b459e3fae28da165ef0ee506ec5ccd39026d7b7e7c35a3f242c65d60d0ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3de3a4f166173fb3f0ff7428b1d1ddf43a9988749fc445ecff4a7527bd1f3a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3de3a4f166173fb3f0ff7428b1d1ddf43a9988749fc445ecff4a7527bd1f3a4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z8hcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:50Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:50 crc kubenswrapper[4957]: I0123 10:52:50.868891 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9rkq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"340bb9e5-0a20-4377-acf7-aba4b7788153\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac26363f220465a8681578f7a7b90cf7d0abf8676379a1e963f4998646327c95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:52:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55l78
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19e8778b10a2bb2a713374cb69e07a76edf2371f7a130a691e759a03c0322251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:52:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55l78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:52:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r9rkq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:50Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:50 crc kubenswrapper[4957]: I0123 10:52:50.898731 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:50 crc kubenswrapper[4957]: I0123 10:52:50.898789 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:50 crc kubenswrapper[4957]: I0123 10:52:50.898801 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:50 crc kubenswrapper[4957]: I0123 10:52:50.898823 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:50 crc kubenswrapper[4957]: I0123 10:52:50.898838 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:50Z","lastTransitionTime":"2026-01-23T10:52:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:50 crc kubenswrapper[4957]: I0123 10:52:50.900360 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8631604-63ce-40b0-b27e-fba17f940f20\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://523b9a208f414955faffe254957d3bb6d287eab26ea653e23c9bcc2c3182d5cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80645d17b02b24a907b20d376fcb65a794768d4c9cf07550bff63d50a011836d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3fcfc9fcf5f37f32b4a654710f3f0f5c3fab5b0b5c35239e5f1a2789d1ec480\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b48e3593322a778bf8d56e3509d97a341f1fee5e172f8ba4bbc4c1dacefb3930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://193091ca5d5fb974b1e2da289e7fbc6e2d3a292e79d1936c7ba10266a5ba9779\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7feaf0932d8842db7af2f28984a811c16674b50ba28a4627829ce8471543daa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7feaf0932d8842db7af2f28984a811c16674b50ba28a4627829ce8471543daa9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://370f9a1676bfe308ad9883454af09a802df678db77e0e246de3cc7aea91410a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://370f9a1676bfe308ad9883454af09a802df678db77e0e246de3cc7aea91410a6\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://78b38484db8444b196494249f32bd63f2478d80e89ff823c8ac7c974b1ea6e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78b38484db8444b196494249f32bd63f2478d80e89ff823c8ac7c974b1ea6e76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:50Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:50 crc kubenswrapper[4957]: I0123 10:52:50.916857 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b07c6571fe0e39bd6607feb900919a481ef8a36483c9b4de1c6d5ea3453ba61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:50Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:50 crc kubenswrapper[4957]: I0123 10:52:50.932005 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fnxz6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c7b1449-2e9b-4c07-a531-591cb968f511\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6410b47b5b38b4ce50175e8cd9c2cc7ca241b914d1dba4accf3a1deb3e066ccd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj687\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:49Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fnxz6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:50Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:50 crc kubenswrapper[4957]: I0123 10:52:50.946690 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5fxfb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87775b38-0664-48f6-8857-7568c135bd79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrj7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrj7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:52:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5fxfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:50Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:50 crc kubenswrapper[4957]: I0123 10:52:50.962189 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2f29c0b-284f-4acf-a7cf-c6dd5c1f7ddb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0689e778b4c0b920eda290d8614f287f3f85456eb0eb55c546fef72252cbba8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e50964685dfb51b71c9b29730e05b31cdfe4bf9f81bfc1313b8c0074f615af0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e50964685dfb51b71c9b29730e05b31cdfe4bf9f81bfc1313b8c0074f615af0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:50Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:50 crc kubenswrapper[4957]: I0123 10:52:50.977356 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea507738-b425-4366-808b-3a47317e66d0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cdf71b1a8491d3a4853fde19a5b1af1eb4697cbf07de482e22a52704ba0470f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52eea4c3c7c3b8898e64dd0eb05c1883ea1c2fa94e7e606f3ab48bbf5aaee8d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f405b6b517d30a201b793965bd82536f496d62b89562cefc7e3a9d9f7829633\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b6915f908509c8609290327ffc2dccf0e5680dc227979285a7ebaca4643cb7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e837e02e63dbe59e7920302c0fb0b5c9165e96ebb684adadb02bacd61633214\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"le observer\\\\nW0123 10:51:48.273886 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0123 10:51:48.273997 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0123 10:51:48.275269 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3502356273/tls.crt::/tmp/serving-cert-3502356273/tls.key\\\\\\\"\\\\nI0123 10:51:48.537137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0123 10:51:48.548481 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0123 10:51:48.548515 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0123 10:51:48.548544 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0123 10:51:48.548577 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0123 10:51:48.561057 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0123 10:51:48.561112 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0123 10:51:48.561123 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0123 10:51:48.561139 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0123 10:51:48.561151 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0123 10:51:48.561158 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0123 10:51:48.561167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0123 10:51:48.561209 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0123 10:51:48.561756 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da77583099215643577c5d064d67ce2cca9d0b74e7ba7c88f3a948a8516fd66c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd61e9841e88c0a34764491cfafe3e4f94233e4ac0543a3e6ed1326dd8d7280d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd61e9841e88c0a34764491cfafe3e4f94233e4ac0543a3e6ed1326dd8d7280d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:50Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:50 crc kubenswrapper[4957]: I0123 10:52:50.989984 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"49d5bf3b-7b38-431d-abdd-266da7d33d54\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dea2d00e0b56b1da716e6188c5d0c1cbb52bcdf2a9483168aabc8f3bc408b7b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e85d135cfb611a98270feed36b0cb6f5992ca1432d5d1af0a62465e71ddd0244\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95ec9dd4bf806f7dd14d9e8b14fb6ccd83a8f5b1226a4cb365a946f4c6f8adad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://646a6c52a5b43295fea795b62f3903a351b07e95dc45af842bbe3c3218e143ff\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://646a6c52a5b43295fea795b62f3903a351b07e95dc45af842bbe3c3218e143ff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:50Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:51 crc kubenswrapper[4957]: I0123 10:52:51.001200 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:51 crc kubenswrapper[4957]: I0123 10:52:51.001231 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:51 crc kubenswrapper[4957]: I0123 10:52:51.001239 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:51 crc kubenswrapper[4957]: I0123 10:52:51.001251 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:51 crc kubenswrapper[4957]: I0123 10:52:51.001260 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:51Z","lastTransitionTime":"2026-01-23T10:52:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:51 crc kubenswrapper[4957]: I0123 10:52:51.005649 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:51Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:51 crc kubenswrapper[4957]: I0123 10:52:51.022810 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9f19c40d295c11e3a1170d61fd738b1dcd8fb10087f6a1bb74e6e6c8e6cfb56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6fcda9eaf99f7b60db85da6f18a98ccca7b5bc532aa28388fc7845caf1a7356\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:51Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:51 crc kubenswrapper[4957]: I0123 10:52:51.037039 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:51Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:51 crc kubenswrapper[4957]: I0123 10:52:51.051478 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6cq2v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11d94cd0-1619-4ef6-952a-aef84e1cdc75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdb19fbdef461009ebd78d9089ba9c94908e4c9fbcab108320e0d89c7f30547f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d28aff8ae7d4adbef5753dc21af0f1055f19ed4404bf9af31f34be215120555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d28aff8ae7d4adbef5753dc21af0f1055f19ed4404bf9af31f34be215120555\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90e2094361e033ef7c8226913e9e396e6a2036617edb9c5ad0ed00c2c5f9f707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90e2094361e033ef7c8226913e9e396e6a2036617edb9c5ad0ed00c2c5f9f707\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e945a247fdc5833508edb04c5130a406d137e3b0ab1be12f9bf954f2621c3bf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e945a247fdc5833508edb04c5130a406d137e3b0ab1be12f9bf954f2621c3bf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26ffeff282bb20d68e9f648999f28b818a67b74f062714fd26ad22e70591a74c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26ffeff282bb20d68e9f648999f28b818a67b74f062714fd26ad22e70591a74c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bbe74c2c27b97766cfa5d26987cd0fbd8289871e66eab78c54a52c0aa039728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bbe74c2c27b97766cfa5d26987cd0fbd8289871e66eab78c54a52c0aa039728\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9ee5b55e77324735662dd6bc0fdeee86af454eb4b9e8eb9e877119f7c1395ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9ee5b55e77324735662dd6bc0fdeee86af454eb4b9e8eb9e877119f7c1395ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6cq2v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:51Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:51 crc kubenswrapper[4957]: I0123 10:52:51.064240 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://512cd439903792d034cd6017d149d8f3e9e24ffbfc36964572fc9419d54c3513\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:51Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:51 crc kubenswrapper[4957]: I0123 10:52:51.075419 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:51Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:51 crc kubenswrapper[4957]: I0123 10:52:51.103639 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:51 crc kubenswrapper[4957]: I0123 10:52:51.103670 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:51 crc kubenswrapper[4957]: I0123 10:52:51.103678 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:51 crc kubenswrapper[4957]: I0123 10:52:51.103692 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:51 crc kubenswrapper[4957]: I0123 10:52:51.103709 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:51Z","lastTransitionTime":"2026-01-23T10:52:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:51 crc kubenswrapper[4957]: I0123 10:52:51.206895 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:51 crc kubenswrapper[4957]: I0123 10:52:51.206938 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:51 crc kubenswrapper[4957]: I0123 10:52:51.206946 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:51 crc kubenswrapper[4957]: I0123 10:52:51.206962 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:51 crc kubenswrapper[4957]: I0123 10:52:51.206971 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:51Z","lastTransitionTime":"2026-01-23T10:52:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:51 crc kubenswrapper[4957]: I0123 10:52:51.232878 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z8hcw_87adc28a-89e3-4743-a9f2-098d4a9432d8/ovnkube-controller/2.log" Jan 23 10:52:51 crc kubenswrapper[4957]: I0123 10:52:51.235812 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" event={"ID":"87adc28a-89e3-4743-a9f2-098d4a9432d8","Type":"ContainerStarted","Data":"6cbd0fe66fb090078f66ddc5174cf5273cbe2b54ca7beb8afcf6de97c848666e"} Jan 23 10:52:51 crc kubenswrapper[4957]: I0123 10:52:51.236325 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" Jan 23 10:52:51 crc kubenswrapper[4957]: I0123 10:52:51.249613 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:51Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:51 crc kubenswrapper[4957]: I0123 10:52:51.262465 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6cq2v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11d94cd0-1619-4ef6-952a-aef84e1cdc75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdb19fbdef461009ebd78d9089ba9c94908e4c9fbcab108320e0d89c7f30547f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d28aff8ae7d4adbef5753dc21af0f1055f19ed4404bf9af31f34be215120555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5
db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d28aff8ae7d4adbef5753dc21af0f1055f19ed4404bf9af31f34be215120555\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90e2094361e033ef7c8226913e9e396e6a2036617edb9c5ad0ed00c2c5f9f707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90e2094361e033ef7c8226913e9e396e6a2036617edb9c5ad0ed00c2c5f9f707\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e945a247fdc5833508edb04c5130a406d137e3b0ab1be12f9bf954f2621c3bf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e945a247fdc5833508edb04c5130a406d137e3b0ab1be12f9bf954f2621c3bf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io
/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26ffeff282bb20d68e9f648999f28b818a67b74f062714fd26ad22e70591a74c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26ffeff282bb20d68e9f648999f28b818a67b74f062714fd26ad22e70591a74c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bbe74c2c27b97766cfa5d26987cd0fbd8289871e66eab78c54a52c0aa039728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bbe74c2c27b97766cfa5d26987cd0fbd8289871e66eab78c54a52c0aa039728\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9ee5b55e77324735662dd6bc0fdeee86af454eb4b9e8eb9e877119f7c1395ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9ee5b55e77324735662dd6bc0fdeee86af454eb4b9e8eb9e877119f7c1395ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mo
untPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6cq2v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:51Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:51 crc kubenswrapper[4957]: I0123 10:52:51.274613 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5fxfb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87775b38-0664-48f6-8857-7568c135bd79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrj7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrj7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:52:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5fxfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:51Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:51 crc kubenswrapper[4957]: I0123 10:52:51.287019 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2f29c0b-284f-4acf-a7cf-c6dd5c1f7ddb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0689e778b4c0b920eda290d8614f287f3f85456eb0eb55c546fef72252cbba8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e50964685dfb51b71c9b29730e05b31cdfe4bf9f81bfc1313b8c0074f615af0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e50964685dfb51b71c9b29730e05b31cdfe4bf9f81bfc1313b8c0074f615af0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:51Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:51 crc kubenswrapper[4957]: I0123 10:52:51.300777 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea507738-b425-4366-808b-3a47317e66d0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cdf71b1a8491d3a4853fde19a5b1af1eb4697cbf07de482e22a52704ba0470f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52eea4c3c7c3b8898e64dd0eb05c1883ea1c2fa94e7e606f3ab48bbf5aaee8d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f405b6b517d30a201b793965bd82536f496d62b89562cefc7e3a9d9f7829633\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b6915f908509c8609290327ffc2dccf0e5680dc227979285a7ebaca4643cb7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e837e02e63dbe59e7920302c0fb0b5c9165e96ebb684adadb02bacd61633214\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"le observer\\\\nW0123 10:51:48.273886 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0123 10:51:48.273997 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0123 10:51:48.275269 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3502356273/tls.crt::/tmp/serving-cert-3502356273/tls.key\\\\\\\"\\\\nI0123 10:51:48.537137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0123 10:51:48.548481 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0123 10:51:48.548515 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0123 10:51:48.548544 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0123 10:51:48.548577 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0123 10:51:48.561057 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0123 10:51:48.561112 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0123 10:51:48.561123 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0123 10:51:48.561139 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0123 10:51:48.561151 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0123 10:51:48.561158 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0123 10:51:48.561167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0123 10:51:48.561209 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0123 10:51:48.561756 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da77583099215643577c5d064d67ce2cca9d0b74e7ba7c88f3a948a8516fd66c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd61e9841e88c0a34764491cfafe3e4f94233e4ac0543a3e6ed1326dd8d7280d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd61e9841e88c0a34764491cfafe3e4f94233e4ac0543a3e6ed1326dd8d7280d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:51Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:51 crc kubenswrapper[4957]: I0123 10:52:51.309791 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:51 crc kubenswrapper[4957]: I0123 10:52:51.309873 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:51 crc kubenswrapper[4957]: I0123 10:52:51.309888 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:51 crc kubenswrapper[4957]: I0123 10:52:51.309904 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:51 crc kubenswrapper[4957]: I0123 10:52:51.309916 4957 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:51Z","lastTransitionTime":"2026-01-23T10:52:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:51 crc kubenswrapper[4957]: I0123 10:52:51.316925 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49d5bf3b-7b38-431d-abdd-266da7d33d54\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dea2d00e0b56b1da716e6188c5d0c1cbb52bcdf2a9483168aabc8f3bc408b7b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e85d135cfb611a98270feed36b0cb6f5992ca1432d5d1af0a62465e71ddd0244\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95ec9dd4bf806f7dd14d9e8b14fb6ccd83a8f5b1226a4cb365a946f4c6f8adad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controlle
r\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://646a6c52a5b43295fea795b62f3903a351b07e95dc45af842bbe3c3218e143ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://646a6c52a5b43295fea795b62f3903a351b07e95dc45af842bbe3c3218e143ff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:51Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:51 crc kubenswrapper[4957]: I0123 10:52:51.328708 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:51Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:51 crc kubenswrapper[4957]: I0123 10:52:51.342838 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9f19c40d295c11e3a1170d61fd738b1dcd8fb10087f6a1bb74e6e6c8e6cfb56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6fcda9eaf99f7b60db85da6f18a98ccca7b5bc532aa28388fc7845caf1a7356\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:51Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:51 crc kubenswrapper[4957]: I0123 10:52:51.359623 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://512cd439903792d034cd6017d149d8f3e9e24ffbfc36964572fc9419d54c3513\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:51Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:51 crc kubenswrapper[4957]: I0123 10:52:51.376627 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:51Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:51 crc kubenswrapper[4957]: I0123 10:52:51.395403 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9rkq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"340bb9e5-0a20-4377-acf7-aba4b7788153\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac26363f220465a8681578f7a7b90cf7d0abf8676379a1e963f4998646327c95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:52:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55l78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19e8778b10a2bb2a713374cb69e07a76edf2371f7a130a691e759a03c0322251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:52:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55l78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:52:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r9rkq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:51Z is after 2025-08-24T17:21:41Z" Jan 23 
10:52:51 crc kubenswrapper[4957]: I0123 10:52:51.409474 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb53662e-fe72-4c19-b3a6-f5b541e5afcb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://829cfdb541d2a7861316957f39b8b9f43ec6f9f4e309a491f4451b1f3c34a9a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d9f270c80ebedc7d79510e2f421e23789483dce954f5e1469469703660febf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90926087d1bb350c991fa9425706fcc22e12eec003aba87b72758892aae9d3a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\
",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8144556693b41dc2f9121be49ceed161caf8db5eec797f086128a2016be8072\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:51Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:51 crc kubenswrapper[4957]: I0123 10:52:51.411569 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:51 crc kubenswrapper[4957]: I0123 10:52:51.411601 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:51 crc kubenswrapper[4957]: I0123 10:52:51.411610 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:51 crc kubenswrapper[4957]: I0123 10:52:51.411625 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:51 crc kubenswrapper[4957]: I0123 10:52:51.411635 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:51Z","lastTransitionTime":"2026-01-23T10:52:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:51 crc kubenswrapper[4957]: I0123 10:52:51.419896 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rg9hb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a6ddd9-627a-4faa-a4c4-096ea19af31d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f469de9a3c43ade33ae855757f1244dcd825827dea9633af7143c078b08d6d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wngq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rg9hb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:51Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:51 crc kubenswrapper[4957]: I0123 10:52:51.430395 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2xjv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"224e3211-1f68-4673-8975-7e71b1e513d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dd046581d049e9ca0071a010da143a9b28d271b533b9cdc1c94d19311be0320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s26mf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f355e8990ff693448c7b8df392b7b2caeb59d6fee6cf8d5d4200f8ce1b5e03ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s26mf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w2xjv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:51Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:51 crc kubenswrapper[4957]: I0123 10:52:51.444667 4957 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-tlz2g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"233fdd78-4010-4fe8-9068-ee47d8ff25d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a41f7e81b1359b374160b43aed747c1058d4a086980d803825ae41e507f3d77c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6851e0ec1040550b8c9edb1b85213d2c849e381fae6b0f09c9a7247bd9c5088\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-23T10:52:37Z\\\",\\\"message\\\":\\\"2026-01-23T10:51:52+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_cc2190a4-eddd-4623-837e-d09cf8000fdd\\\\n2026-01-23T10:51:52+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_cc2190a4-eddd-4623-837e-d09cf8000fdd to /host/opt/cni/bin/\\\\n2026-01-23T10:51:52Z [verbose] multus-daemon started\\\\n2026-01-23T10:51:52Z [verbose] Readiness Indicator file check\\\\n2026-01-23T10:52:37Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpwrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tlz2g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:51Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:51 crc kubenswrapper[4957]: I0123 10:52:51.460256 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87adc28a-89e3-4743-a9f2-098d4a9432d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a14cf2687aa7c7a4c43dffbc2ad99a41aef0e46719171f63c7f769ee2d54e4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca18567eec1b0cc34d911b28d9f3d670a061722086817f58236f6a0da557262\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f862c6f11fe904458a8ecde92079c0b4aa4a9cb4dfc6f2ca094a1d3142570d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",
\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0efa75cf10a812bc4de5b071048558eb5f48828f6fb3049f3820fe5e0b7e2b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1a90ac89ce8ac710e5f8cff26e69aff44a735ef8155a7e93324809904a33e01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26b4bdc4f2514902dc2c95df59af4c954a5c1905821f5981e9437ff54d6d544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-s
ocket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cbd0fe66fb090078f66ddc5174cf5273cbe2b54ca7beb8afcf6de97c848666e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b61126eca35a8279d32f7b9386c382f26da91f6b28007d881463f438155d2e3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-23T10:52:21Z\\\",\\\"message\\\":\\\"e configs for network=default: []services.lbConfig(nil)\\\\nI0123 10:52:21.654230 6620 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-service-ca-operator/metrics]} name:Service_openshift-service-ca-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.40:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {2a3fb1a3-a476-4e14-bcf5-fb79af60206a}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0123 10:52:21.654121 6620 ovnkube_controller.go:900] Cache entry expected pod with UID \\\\\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\\\\\" but failed to find it\\\\nF0123 10:52:21.652450 6620 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network 
controlle\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-23T10:52:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:52:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1be5b459e3fae28da165ef0ee506ec5ccd39026d7b7e7c35a3f242c65d60d0ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{
\\\"containerID\\\":\\\"cri-o://3de3a4f166173fb3f0ff7428b1d1ddf43a9988749fc445ecff4a7527bd1f3a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3de3a4f166173fb3f0ff7428b1d1ddf43a9988749fc445ecff4a7527bd1f3a4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z8hcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:51Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:51 crc kubenswrapper[4957]: I0123 10:52:51.480226 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8631604-63ce-40b0-b27e-fba17f940f20\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://523b9a208f414955faffe254957d3bb6d287eab26ea653e23c9bcc2c3182d5cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80645d17b02b24a907b20d376fcb65a794768d4c9cf07550bff63d50a011836d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3fcfc9fcf5f37f32b4a654710f3f0f5c3fab5b0b5c35239e5f1a2789d1ec480\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b48e3593322a778bf8d56e3509d97a341f1fee5
e172f8ba4bbc4c1dacefb3930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://193091ca5d5fb974b1e2da289e7fbc6e2d3a292e79d1936c7ba10266a5ba9779\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7feaf0932d8842db7af2f28984a811c16674b50ba28a4627829ce8471543daa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7feaf0932d8842db7af2f28984a811c16674b50ba28a4627829ce8471543daa9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://370f9a1676bfe308ad9883454af09a802df678db77e0e246de3cc7aea91410a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://370f9a1676bfe308ad9883454af09a802df678db77e0e246de3cc7aea91410a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://78b38484db8444b196494249f32bd63f2478d80e89ff823c8ac7c974b1ea6e76\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78b38484db8444b196494249f32bd63f2478d80e89ff823c8ac7c974b1ea6e76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:51Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:51 crc kubenswrapper[4957]: I0123 10:52:51.491081 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b07c6571fe0e39bd6607feb900919a481ef8a36483c9b4de1c6d5ea3453ba61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:51Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:51 crc kubenswrapper[4957]: I0123 10:52:51.501266 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fnxz6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c7b1449-2e9b-4c07-a531-591cb968f511\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6410b47b5b38b4ce50175e8cd9c2cc7ca241b914d1dba4accf3a1deb3e066ccd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj687\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:49Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fnxz6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:51Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:51 crc kubenswrapper[4957]: I0123 10:52:51.513540 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:51 crc kubenswrapper[4957]: I0123 10:52:51.513683 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:51 crc kubenswrapper[4957]: I0123 10:52:51.513757 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:51 crc kubenswrapper[4957]: I0123 10:52:51.513820 4957 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeNotReady" Jan 23 10:52:51 crc kubenswrapper[4957]: I0123 10:52:51.513876 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:51Z","lastTransitionTime":"2026-01-23T10:52:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:51 crc kubenswrapper[4957]: I0123 10:52:51.616969 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:51 crc kubenswrapper[4957]: I0123 10:52:51.617007 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:51 crc kubenswrapper[4957]: I0123 10:52:51.617017 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:51 crc kubenswrapper[4957]: I0123 10:52:51.617031 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:51 crc kubenswrapper[4957]: I0123 10:52:51.617041 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:51Z","lastTransitionTime":"2026-01-23T10:52:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:51 crc kubenswrapper[4957]: I0123 10:52:51.719437 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:51 crc kubenswrapper[4957]: I0123 10:52:51.719712 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:51 crc kubenswrapper[4957]: I0123 10:52:51.719820 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:51 crc kubenswrapper[4957]: I0123 10:52:51.719929 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:51 crc kubenswrapper[4957]: I0123 10:52:51.720043 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:51Z","lastTransitionTime":"2026-01-23T10:52:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:51 crc kubenswrapper[4957]: I0123 10:52:51.765325 4957 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-26 22:34:27.968948593 +0000 UTC Jan 23 10:52:51 crc kubenswrapper[4957]: I0123 10:52:51.822701 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:51 crc kubenswrapper[4957]: I0123 10:52:51.822739 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:51 crc kubenswrapper[4957]: I0123 10:52:51.822748 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:51 crc kubenswrapper[4957]: I0123 10:52:51.822762 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:51 crc kubenswrapper[4957]: I0123 10:52:51.822770 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:51Z","lastTransitionTime":"2026-01-23T10:52:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:51 crc kubenswrapper[4957]: I0123 10:52:51.925999 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:51 crc kubenswrapper[4957]: I0123 10:52:51.926042 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:51 crc kubenswrapper[4957]: I0123 10:52:51.926054 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:51 crc kubenswrapper[4957]: I0123 10:52:51.926074 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:51 crc kubenswrapper[4957]: I0123 10:52:51.926090 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:51Z","lastTransitionTime":"2026-01-23T10:52:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:52 crc kubenswrapper[4957]: I0123 10:52:52.029163 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:52 crc kubenswrapper[4957]: I0123 10:52:52.029247 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:52 crc kubenswrapper[4957]: I0123 10:52:52.029262 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:52 crc kubenswrapper[4957]: I0123 10:52:52.029302 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:52 crc kubenswrapper[4957]: I0123 10:52:52.029317 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:52Z","lastTransitionTime":"2026-01-23T10:52:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:52 crc kubenswrapper[4957]: I0123 10:52:52.132075 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:52 crc kubenswrapper[4957]: I0123 10:52:52.132127 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:52 crc kubenswrapper[4957]: I0123 10:52:52.132144 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:52 crc kubenswrapper[4957]: I0123 10:52:52.132168 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:52 crc kubenswrapper[4957]: I0123 10:52:52.132185 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:52Z","lastTransitionTime":"2026-01-23T10:52:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:52 crc kubenswrapper[4957]: I0123 10:52:52.235221 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:52 crc kubenswrapper[4957]: I0123 10:52:52.235564 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:52 crc kubenswrapper[4957]: I0123 10:52:52.235588 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:52 crc kubenswrapper[4957]: I0123 10:52:52.235615 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:52 crc kubenswrapper[4957]: I0123 10:52:52.235633 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:52Z","lastTransitionTime":"2026-01-23T10:52:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:52 crc kubenswrapper[4957]: I0123 10:52:52.245205 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z8hcw_87adc28a-89e3-4743-a9f2-098d4a9432d8/ovnkube-controller/3.log" Jan 23 10:52:52 crc kubenswrapper[4957]: I0123 10:52:52.247910 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z8hcw_87adc28a-89e3-4743-a9f2-098d4a9432d8/ovnkube-controller/2.log" Jan 23 10:52:52 crc kubenswrapper[4957]: I0123 10:52:52.252901 4957 generic.go:334] "Generic (PLEG): container finished" podID="87adc28a-89e3-4743-a9f2-098d4a9432d8" containerID="6cbd0fe66fb090078f66ddc5174cf5273cbe2b54ca7beb8afcf6de97c848666e" exitCode=1 Jan 23 10:52:52 crc kubenswrapper[4957]: I0123 10:52:52.252959 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" event={"ID":"87adc28a-89e3-4743-a9f2-098d4a9432d8","Type":"ContainerDied","Data":"6cbd0fe66fb090078f66ddc5174cf5273cbe2b54ca7beb8afcf6de97c848666e"} Jan 23 10:52:52 crc kubenswrapper[4957]: I0123 10:52:52.253005 4957 scope.go:117] "RemoveContainer" containerID="1b61126eca35a8279d32f7b9386c382f26da91f6b28007d881463f438155d2e3" Jan 23 10:52:52 crc kubenswrapper[4957]: I0123 10:52:52.254388 4957 scope.go:117] "RemoveContainer" containerID="6cbd0fe66fb090078f66ddc5174cf5273cbe2b54ca7beb8afcf6de97c848666e" Jan 23 10:52:52 crc kubenswrapper[4957]: E0123 10:52:52.254743 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-z8hcw_openshift-ovn-kubernetes(87adc28a-89e3-4743-a9f2-098d4a9432d8)\"" pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" podUID="87adc28a-89e3-4743-a9f2-098d4a9432d8" Jan 23 10:52:52 crc kubenswrapper[4957]: I0123 10:52:52.270449 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rg9hb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a6ddd9-627a-4faa-a4c4-096ea19af31d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f469de9a3c43ade33ae855757f1244dcd825827dea9633af7143c078b08d6d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wngq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rg9hb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:52Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:52 crc kubenswrapper[4957]: I0123 10:52:52.291637 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2xjv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"224e3211-1f68-4673-8975-7e71b1e513d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dd046581d049e9ca0071a010da143a9b28d271b533b9cdc1c94d19311be0320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s26mf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f355e8990ff693448c7b8df392b7b2caeb59d6fee6cf8d5d4200f8ce1b5e03ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s26mf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w2xjv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:52Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:52 crc kubenswrapper[4957]: I0123 10:52:52.310667 4957 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-tlz2g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"233fdd78-4010-4fe8-9068-ee47d8ff25d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a41f7e81b1359b374160b43aed747c1058d4a086980d803825ae41e507f3d77c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6851e0ec1040550b8c9edb1b85213d2c849e381fae6b0f09c9a7247bd9c5088\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-23T10:52:37Z\\\",\\\"message\\\":\\\"2026-01-23T10:51:52+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_cc2190a4-eddd-4623-837e-d09cf8000fdd\\\\n2026-01-23T10:51:52+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_cc2190a4-eddd-4623-837e-d09cf8000fdd to /host/opt/cni/bin/\\\\n2026-01-23T10:51:52Z [verbose] multus-daemon started\\\\n2026-01-23T10:51:52Z [verbose] Readiness Indicator file check\\\\n2026-01-23T10:52:37Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpwrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tlz2g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:52Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:52 crc kubenswrapper[4957]: I0123 10:52:52.336459 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87adc28a-89e3-4743-a9f2-098d4a9432d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a14cf2687aa7c7a4c43dffbc2ad99a41aef0e46719171f63c7f769ee2d54e4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca18567eec1b0cc34d911b28d9f3d670a061722086817f58236f6a0da557262\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f862c6f11fe904458a8ecde92079c0b4aa4a9cb4dfc6f2ca094a1d3142570d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",
\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0efa75cf10a812bc4de5b071048558eb5f48828f6fb3049f3820fe5e0b7e2b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1a90ac89ce8ac710e5f8cff26e69aff44a735ef8155a7e93324809904a33e01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26b4bdc4f2514902dc2c95df59af4c954a5c1905821f5981e9437ff54d6d544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-s
ocket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cbd0fe66fb090078f66ddc5174cf5273cbe2b54ca7beb8afcf6de97c848666e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b61126eca35a8279d32f7b9386c382f26da91f6b28007d881463f438155d2e3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-23T10:52:21Z\\\",\\\"message\\\":\\\"e configs for network=default: []services.lbConfig(nil)\\\\nI0123 10:52:21.654230 6620 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-service-ca-operator/metrics]} name:Service_openshift-service-ca-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.40:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {2a3fb1a3-a476-4e14-bcf5-fb79af60206a}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0123 10:52:21.654121 6620 ovnkube_controller.go:900] Cache entry expected pod with UID \\\\\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\\\\\" but failed to find it\\\\nF0123 10:52:21.652450 6620 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controlle\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-23T10:52:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cbd0fe66fb090078f66ddc5174cf5273cbe2b54ca7beb8afcf6de97c848666e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-23T10:52:52Z\\\",\\\"message\\\":\\\"Map:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.245:443: 10.217.5.245:9192:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {54fbe873-7e6d-475f-a0ad-8dd5f06d850d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0123 10:52:51.767445 7016 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-api/cluster-autoscaler-operator]} name:Service_openshift-machine-api/cluster-autoscaler-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} 
protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.245:443: 10.217.5.245:9192:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {54fbe873-7e6d-475f-a0ad-8dd5f06d850d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0123 10:52:51.767543 7016 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0123 10:52:51.767599 7016 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-23T10:52:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1be5b459e3fae28da165ef0ee506ec5ccd39026d7b7e7c35a3f242c65d60d0ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountP
ath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3de3a4f166173fb3f0ff7428b1d1ddf43a9988749fc445ecff4a7527bd1f3a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3de3a4f166173fb3f0ff7428b1d1ddf43a9988749fc445ecff4a7527bd1f3a4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z8hcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:52Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:52 crc kubenswrapper[4957]: I0123 10:52:52.338661 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:52 crc kubenswrapper[4957]: I0123 10:52:52.338724 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:52 crc kubenswrapper[4957]: I0123 10:52:52.338735 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:52 crc kubenswrapper[4957]: I0123 10:52:52.338754 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:52 crc kubenswrapper[4957]: I0123 10:52:52.338766 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:52Z","lastTransitionTime":"2026-01-23T10:52:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:52 crc kubenswrapper[4957]: I0123 10:52:52.352832 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9rkq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"340bb9e5-0a20-4377-acf7-aba4b7788153\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac26363f220465a8681578f7a7b90cf7d0abf8676379a1e963f4998646327c95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:52:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55l78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19e8778b10a2bb2a713374cb69e07a76edf2371f7a130a691e759a03c0322251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:52:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55l78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:52:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r9rkq\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:52Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:52 crc kubenswrapper[4957]: I0123 10:52:52.370944 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb53662e-fe72-4c19-b3a6-f5b541e5afcb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://829cfdb541d2a7861316957f39b8b9f43ec6f9f4e309a491f4451b1f3c34a9a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d9f270c80ebedc7d79510e2f421e23789483dce954f5e1469469703660febf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90926087d1bb350c991fa9425706fcc22e12eec003aba87b72758892aae9d3a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"st
arted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8144556693b41dc2f9121be49ceed161caf8db5eec797f086128a2016be8072\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:52Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:52 crc kubenswrapper[4957]: I0123 10:52:52.388702 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b07c6571fe0e39bd6607feb900919a481ef8a36483c9b4de1c6d5ea3453ba61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:52Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:52 crc kubenswrapper[4957]: I0123 10:52:52.404271 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fnxz6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c7b1449-2e9b-4c07-a531-591cb968f511\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6410b47b5b38b4ce50175e8cd9c2cc7ca241b914d1dba4accf3a1deb3e066ccd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj687\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:49Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fnxz6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:52Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:52 crc kubenswrapper[4957]: I0123 10:52:52.437672 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8631604-63ce-40b0-b27e-fba17f940f20\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://523b9a208f414955faffe254957d3bb6d287eab26ea653e23c9bcc2c3182d5cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80645d17b02b24a907b20d376fcb65a794768d4c9cf07550bff63d50a011836d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3fcfc9fcf5f37f32b4a654710f3f0f5c3fab5b0b5c35239e5f1a2789d1ec480\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b48e3593322a778bf8d56e3509d97a341f1fee5
e172f8ba4bbc4c1dacefb3930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://193091ca5d5fb974b1e2da289e7fbc6e2d3a292e79d1936c7ba10266a5ba9779\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7feaf0932d8842db7af2f28984a811c16674b50ba28a4627829ce8471543daa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7feaf0932d8842db7af2f28984a811c16674b50ba28a4627829ce8471543daa9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://370f9a1676bfe308ad9883454af09a802df678db77e0e246de3cc7aea91410a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://370f9a1676bfe308ad9883454af09a802df678db77e0e246de3cc7aea91410a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://78b38484db8444b196494249f32bd63f2478d80e89ff823c8ac7c974b1ea6e76\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78b38484db8444b196494249f32bd63f2478d80e89ff823c8ac7c974b1ea6e76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:52Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:52 crc kubenswrapper[4957]: I0123 10:52:52.440880 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:52 crc kubenswrapper[4957]: I0123 10:52:52.440916 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:52 crc kubenswrapper[4957]: I0123 10:52:52.440928 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:52 crc kubenswrapper[4957]: I0123 10:52:52.440946 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:52 crc kubenswrapper[4957]: I0123 10:52:52.440958 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:52Z","lastTransitionTime":"2026-01-23T10:52:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:52 crc kubenswrapper[4957]: I0123 10:52:52.456050 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea507738-b425-4366-808b-3a47317e66d0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cdf71b1a8491d3a4853fde19a5b1af1eb4697cbf07de482e22a52704ba0470f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52eea4c3c7c3b8898e64dd0eb05c1883ea1c2fa94e7e606f3ab48bbf5aaee8d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f405b6b517d30a201b793965bd82536f496d62b89562cefc7e3a9d9f7829633\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b6915f908509c8609290327ffc2dccf0e5680dc227979285a7ebaca4643cb7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e837e02e63dbe59e7920302c0fb0b5c9165e96ebb684adadb02bacd61633214\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"le observer\\\\nW0123 10:51:48.273886 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0123 10:51:48.273997 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0123 10:51:48.275269 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3502356273/tls.crt::/tmp/serving-cert-3502356273/tls.key\\\\\\\"\\\\nI0123 10:51:48.537137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0123 10:51:48.548481 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0123 10:51:48.548515 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0123 10:51:48.548544 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0123 10:51:48.548577 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0123 10:51:48.561057 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0123 10:51:48.561112 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0123 10:51:48.561123 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0123 10:51:48.561139 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0123 10:51:48.561151 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0123 10:51:48.561158 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0123 10:51:48.561167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0123 10:51:48.561209 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0123 10:51:48.561756 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da77583099215643577c5d064d67ce2cca9d0b74e7ba7c88f3a948a8516fd66c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd61e9841e88c0a34764491cfafe3e4f94233e4ac0543a3e6ed1326dd8d7280d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd61e9841e88c0a34764491cfafe3e4f94233e4ac0543a3e6ed1326dd8d7280d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:52Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:52 crc kubenswrapper[4957]: I0123 10:52:52.470018 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"49d5bf3b-7b38-431d-abdd-266da7d33d54\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dea2d00e0b56b1da716e6188c5d0c1cbb52bcdf2a9483168aabc8f3bc408b7b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e85d135cfb611a98270feed36b0cb6f5992ca1432d5d1af0a62465e71ddd0244\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95ec9dd4bf806f7dd14d9e8b14fb6ccd83a8f5b1226a4cb365a946f4c6f8adad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://646a6c52a5b43295fea795b62f3903a351b07e95dc45af842bbe3c3218e143ff\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://646a6c52a5b43295fea795b62f3903a351b07e95dc45af842bbe3c3218e143ff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:52Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:52 crc kubenswrapper[4957]: I0123 10:52:52.486706 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:52Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:52 crc kubenswrapper[4957]: I0123 10:52:52.498773 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9f19c40d295c11e3a1170d61fd738b1dcd8fb10087f6a1bb74e6e6c8e6cfb56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6fcda9eaf99f7b60db85da6f18a98ccca7b5bc532aa28388fc7845caf1a7356\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:52Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:52 crc kubenswrapper[4957]: I0123 10:52:52.510920 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:52Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:52 crc kubenswrapper[4957]: I0123 10:52:52.529703 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6cq2v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11d94cd0-1619-4ef6-952a-aef84e1cdc75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdb19fbdef461009ebd78d9089ba9c94908e4c9fbcab108320e0d89c7f30547f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d28aff8ae7d4adbef5753dc21af0f1055f19ed4404bf9af31f34be215120555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5
db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d28aff8ae7d4adbef5753dc21af0f1055f19ed4404bf9af31f34be215120555\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90e2094361e033ef7c8226913e9e396e6a2036617edb9c5ad0ed00c2c5f9f707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90e2094361e033ef7c8226913e9e396e6a2036617edb9c5ad0ed00c2c5f9f707\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e945a247fdc5833508edb04c5130a406d137e3b0ab1be12f9bf954f2621c3bf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e945a247fdc5833508edb04c5130a406d137e3b0ab1be12f9bf954f2621c3bf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io
/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26ffeff282bb20d68e9f648999f28b818a67b74f062714fd26ad22e70591a74c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26ffeff282bb20d68e9f648999f28b818a67b74f062714fd26ad22e70591a74c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bbe74c2c27b97766cfa5d26987cd0fbd8289871e66eab78c54a52c0aa039728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bbe74c2c27b97766cfa5d26987cd0fbd8289871e66eab78c54a52c0aa039728\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9ee5b55e77324735662dd6bc0fdeee86af454eb4b9e8eb9e877119f7c1395ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9ee5b55e77324735662dd6bc0fdeee86af454eb4b9e8eb9e877119f7c1395ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mo
untPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6cq2v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:52Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:52 crc kubenswrapper[4957]: I0123 10:52:52.543246 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:52 crc kubenswrapper[4957]: I0123 10:52:52.543314 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:52 crc kubenswrapper[4957]: I0123 10:52:52.543322 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:52 crc kubenswrapper[4957]: I0123 10:52:52.543337 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:52 crc kubenswrapper[4957]: I0123 10:52:52.543346 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:52Z","lastTransitionTime":"2026-01-23T10:52:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:52 crc kubenswrapper[4957]: I0123 10:52:52.545718 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5fxfb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87775b38-0664-48f6-8857-7568c135bd79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrj7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrj7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:52:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5fxfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:52Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:52 crc kubenswrapper[4957]: I0123 10:52:52.559184 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2f29c0b-284f-4acf-a7cf-c6dd5c1f7ddb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0689e778b4c0b920eda290d8614f287f3f85456eb0eb55c546fef72252cbba8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e50964685dfb51b71c9b29730e05b31cdfe4bf9f81bfc1313b8c0074f615af0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e50964685dfb51b71c9b29730e05b31cdfe4bf9f81bfc1313b8c0074f615af0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:52Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:52 crc kubenswrapper[4957]: I0123 10:52:52.573824 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:52Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:52 crc kubenswrapper[4957]: I0123 10:52:52.586316 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://512cd439903792d034cd6017d149d8f3e9e24ffbfc36964572fc9419d54c3513\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:52Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:52 crc kubenswrapper[4957]: I0123 10:52:52.645807 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:52 crc kubenswrapper[4957]: I0123 10:52:52.645836 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:52 crc kubenswrapper[4957]: I0123 10:52:52.645844 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:52 crc kubenswrapper[4957]: I0123 10:52:52.645858 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:52 crc kubenswrapper[4957]: I0123 10:52:52.645867 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:52Z","lastTransitionTime":"2026-01-23T10:52:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:52 crc kubenswrapper[4957]: I0123 10:52:52.748673 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:52 crc kubenswrapper[4957]: I0123 10:52:52.748722 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:52 crc kubenswrapper[4957]: I0123 10:52:52.748733 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:52 crc kubenswrapper[4957]: I0123 10:52:52.748748 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:52 crc kubenswrapper[4957]: I0123 10:52:52.748760 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:52Z","lastTransitionTime":"2026-01-23T10:52:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:52 crc kubenswrapper[4957]: I0123 10:52:52.766226 4957 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-15 01:12:53.449097665 +0000 UTC Jan 23 10:52:52 crc kubenswrapper[4957]: I0123 10:52:52.769639 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 23 10:52:52 crc kubenswrapper[4957]: I0123 10:52:52.769696 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5fxfb" Jan 23 10:52:52 crc kubenswrapper[4957]: I0123 10:52:52.769667 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 23 10:52:52 crc kubenswrapper[4957]: E0123 10:52:52.769772 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 23 10:52:52 crc kubenswrapper[4957]: I0123 10:52:52.769640 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 23 10:52:52 crc kubenswrapper[4957]: E0123 10:52:52.770094 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 23 10:52:52 crc kubenswrapper[4957]: E0123 10:52:52.770110 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 23 10:52:52 crc kubenswrapper[4957]: E0123 10:52:52.770151 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5fxfb" podUID="87775b38-0664-48f6-8857-7568c135bd79" Jan 23 10:52:52 crc kubenswrapper[4957]: I0123 10:52:52.796413 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 23 10:52:52 crc kubenswrapper[4957]: I0123 10:52:52.796554 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 23 10:52:52 crc kubenswrapper[4957]: I0123 10:52:52.796592 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 23 10:52:52 crc kubenswrapper[4957]: E0123 10:52:52.796615 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-23 10:53:56.796591387 +0000 UTC m=+146.333844074 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 23 10:52:52 crc kubenswrapper[4957]: I0123 10:52:52.796660 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 23 10:52:52 crc kubenswrapper[4957]: E0123 10:52:52.796696 4957 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 23 10:52:52 crc kubenswrapper[4957]: I0123 10:52:52.796704 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 23 10:52:52 crc kubenswrapper[4957]: E0123 10:52:52.796717 4957 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 23 10:52:52 crc kubenswrapper[4957]: E0123 10:52:52.796740 4957 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 23 10:52:52 crc kubenswrapper[4957]: E0123 10:52:52.796749 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-23 10:53:56.796736842 +0000 UTC m=+146.333989589 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 23 10:52:52 crc kubenswrapper[4957]: E0123 10:52:52.796754 4957 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 23 10:52:52 crc kubenswrapper[4957]: E0123 10:52:52.796805 4957 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 23 10:52:52 crc kubenswrapper[4957]: E0123 10:52:52.796818 4957 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 23 10:52:52 crc kubenswrapper[4957]: E0123 10:52:52.796829 4957 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 23 10:52:52 crc kubenswrapper[4957]: E0123 10:52:52.796807 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-23 10:53:56.796791343 +0000 UTC m=+146.334044040 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 23 10:52:52 crc kubenswrapper[4957]: E0123 10:52:52.796891 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-23 10:53:56.796870925 +0000 UTC m=+146.334123632 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 23 10:52:52 crc kubenswrapper[4957]: E0123 10:52:52.796993 4957 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 23 10:52:52 crc kubenswrapper[4957]: E0123 10:52:52.797208 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-23 10:53:56.797156962 +0000 UTC m=+146.334409689 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 23 10:52:52 crc kubenswrapper[4957]: I0123 10:52:52.851772 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:52 crc kubenswrapper[4957]: I0123 10:52:52.851877 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:52 crc kubenswrapper[4957]: I0123 10:52:52.851908 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:52 crc kubenswrapper[4957]: I0123 10:52:52.851955 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:52 crc kubenswrapper[4957]: I0123 10:52:52.851996 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:52Z","lastTransitionTime":"2026-01-23T10:52:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:52 crc kubenswrapper[4957]: I0123 10:52:52.954972 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:52 crc kubenswrapper[4957]: I0123 10:52:52.955053 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:52 crc kubenswrapper[4957]: I0123 10:52:52.955077 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:52 crc kubenswrapper[4957]: I0123 10:52:52.955109 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:52 crc kubenswrapper[4957]: I0123 10:52:52.955139 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:52Z","lastTransitionTime":"2026-01-23T10:52:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:53 crc kubenswrapper[4957]: I0123 10:52:53.058122 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:53 crc kubenswrapper[4957]: I0123 10:52:53.058186 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:53 crc kubenswrapper[4957]: I0123 10:52:53.058202 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:53 crc kubenswrapper[4957]: I0123 10:52:53.058224 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:53 crc kubenswrapper[4957]: I0123 10:52:53.058239 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:53Z","lastTransitionTime":"2026-01-23T10:52:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:53 crc kubenswrapper[4957]: I0123 10:52:53.160529 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:53 crc kubenswrapper[4957]: I0123 10:52:53.160572 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:53 crc kubenswrapper[4957]: I0123 10:52:53.160583 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:53 crc kubenswrapper[4957]: I0123 10:52:53.160599 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:53 crc kubenswrapper[4957]: I0123 10:52:53.160610 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:53Z","lastTransitionTime":"2026-01-23T10:52:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:53 crc kubenswrapper[4957]: I0123 10:52:53.256789 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z8hcw_87adc28a-89e3-4743-a9f2-098d4a9432d8/ovnkube-controller/3.log" Jan 23 10:52:53 crc kubenswrapper[4957]: I0123 10:52:53.259791 4957 scope.go:117] "RemoveContainer" containerID="6cbd0fe66fb090078f66ddc5174cf5273cbe2b54ca7beb8afcf6de97c848666e" Jan 23 10:52:53 crc kubenswrapper[4957]: E0123 10:52:53.260050 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-z8hcw_openshift-ovn-kubernetes(87adc28a-89e3-4743-a9f2-098d4a9432d8)\"" pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" podUID="87adc28a-89e3-4743-a9f2-098d4a9432d8" Jan 23 10:52:53 crc kubenswrapper[4957]: I0123 10:52:53.263322 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:53 crc kubenswrapper[4957]: I0123 10:52:53.263350 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:53 crc kubenswrapper[4957]: I0123 10:52:53.263386 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:53 crc kubenswrapper[4957]: I0123 10:52:53.263402 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:53 crc kubenswrapper[4957]: I0123 10:52:53.263416 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:53Z","lastTransitionTime":"2026-01-23T10:52:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:53 crc kubenswrapper[4957]: I0123 10:52:53.272113 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://512cd439903792d034cd6017d149d8f3e9e24ffbfc36964572fc9419d54c3513\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:53Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:53 crc kubenswrapper[4957]: I0123 10:52:53.285984 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:53Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:53 crc kubenswrapper[4957]: I0123 10:52:53.298809 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2xjv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"224e3211-1f68-4673-8975-7e71b1e513d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dd046581d049e9ca0071a010da143a9b28d271b533b9cdc1c94d19311be0320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s26mf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":
\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f355e8990ff693448c7b8df392b7b2caeb59d6fee6cf8d5d4200f8ce1b5e03ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s26mf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w2xjv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:53Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:53 crc kubenswrapper[4957]: I0123 10:52:53.315441 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tlz2g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"233fdd78-4010-4fe8-9068-ee47d8ff25d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a41f7e81b1359b374160b43aed747c1058d4a086980d803825ae41e507f3d77c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6851e0ec1040550b8c9edb1b85213d2c849e381fae6b0f09c9a7247bd9c5088\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-23T10:52:37Z\\\",\\\"message\\\":\\\"2026-01-23T10:51:52+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_cc2190a4-eddd-4623-837e-d09cf8000fdd\\\\n2026-01-23T10:51:52+00:00 [cnibincopy] Successfully moved files in 
/host/opt/cni/bin/upgrade_cc2190a4-eddd-4623-837e-d09cf8000fdd to /host/opt/cni/bin/\\\\n2026-01-23T10:51:52Z [verbose] multus-daemon started\\\\n2026-01-23T10:51:52Z [verbose] Readiness Indicator file check\\\\n2026-01-23T10:52:37Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpwrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tlz2g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:53Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:53 crc kubenswrapper[4957]: I0123 10:52:53.366486 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:53 crc kubenswrapper[4957]: I0123 10:52:53.366532 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:53 crc kubenswrapper[4957]: I0123 10:52:53.366542 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:53 crc kubenswrapper[4957]: I0123 10:52:53.366564 4957 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:53 crc kubenswrapper[4957]: I0123 10:52:53.366577 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:53Z","lastTransitionTime":"2026-01-23T10:52:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:53 crc kubenswrapper[4957]: I0123 10:52:53.375613 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87adc28a-89e3-4743-a9f2-098d4a9432d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a14cf2687aa7c7a4c43dffbc2ad99a41aef0e46719171f63c7f769ee2d54e4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca18567eec1b0cc34d911b28d9f3d670a061722086817f58236f6a0da557262\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f862c6f11fe904458a8ecde92079c0b4aa4a9cb4dfc6f2ca094a1d3142570d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0efa75cf10a812bc4de5b071048558eb5f48828f6fb3049f3820fe5e0b7e2b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1a90ac89ce8ac710e5f8cff26e69aff44a735ef8155a7e93324809904a33e01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26b4bdc4f2514902dc2c95df59af4c954a5c1905821f5981e9437ff54d6d544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cbd0fe66fb090078f66ddc5174cf5273cbe2b54
ca7beb8afcf6de97c848666e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cbd0fe66fb090078f66ddc5174cf5273cbe2b54ca7beb8afcf6de97c848666e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-23T10:52:52Z\\\",\\\"message\\\":\\\"Map:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.245:443: 10.217.5.245:9192:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {54fbe873-7e6d-475f-a0ad-8dd5f06d850d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0123 10:52:51.767445 7016 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-api/cluster-autoscaler-operator]} name:Service_openshift-machine-api/cluster-autoscaler-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.245:443: 10.217.5.245:9192:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {54fbe873-7e6d-475f-a0ad-8dd5f06d850d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0123 10:52:51.767543 7016 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0123 10:52:51.767599 7016 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-23T10:52:50Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-z8hcw_openshift-ovn-kubernetes(87adc28a-89e3-4743-a9f2-098d4a9432d8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1be5b459e3fae28da165ef0ee506ec5ccd39026d7b7e7c35a3f242c65d60d0ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3de3a4f166173fb3f0ff7428b1d1ddf43a9988749fc445ecff4a7527bd1f3a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3de3a4f166173fb3f0ff7428b1d1ddf43a9988749fc445ecff4a7527bd1f3a4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z8hcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:53Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:53 crc kubenswrapper[4957]: I0123 10:52:53.397941 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9rkq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"340bb9e5-0a20-4377-acf7-aba4b7788153\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac26363f220465a8681578f7a7b90cf7d0abf8676379a1e963f4998646327c95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:52:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55l78
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19e8778b10a2bb2a713374cb69e07a76edf2371f7a130a691e759a03c0322251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:52:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55l78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:52:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r9rkq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:53Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:53 crc kubenswrapper[4957]: I0123 10:52:53.413634 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb53662e-fe72-4c19-b3a6-f5b541e5afcb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://829cfdb541d2a7861316957f39b8b9f43ec6f9f4e309a491f4451b1f3c34a9a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d9f270c80ebedc7d79510e2f421e23789483dce954f5e1469469703660febf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90926087d1bb350c991fa9425706fcc22e12eec003aba87b72758892aae9d3a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8144556693b41dc2f9121be49ceed161caf8db5eec797f086128a2016be8072\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:53Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:53 crc kubenswrapper[4957]: I0123 10:52:53.427684 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rg9hb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a6ddd9-627a-4faa-a4c4-096ea19af31d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f469de9a3c43ade33ae855757f1244dcd825827dea9633af7143c078b08d6d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wngq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase
\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rg9hb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:53Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:53 crc kubenswrapper[4957]: I0123 10:52:53.444676 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fnxz6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c7b1449-2e9b-4c07-a531-591cb968f511\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6410b47b5b38b4ce50175e8cd9c2cc7ca241b914d1dba4accf3a1deb3e066ccd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj687\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:49Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fnxz6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:53Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:53 crc kubenswrapper[4957]: I0123 10:52:53.469505 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:53 crc kubenswrapper[4957]: I0123 10:52:53.469549 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:53 
crc kubenswrapper[4957]: I0123 10:52:53.469559 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:53 crc kubenswrapper[4957]: I0123 10:52:53.469575 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:53 crc kubenswrapper[4957]: I0123 10:52:53.469586 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:53Z","lastTransitionTime":"2026-01-23T10:52:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:53 crc kubenswrapper[4957]: I0123 10:52:53.478219 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8631604-63ce-40b0-b27e-fba17f940f20\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://523b9a208f414955faffe254957d3bb6d287eab26ea653e23c9bcc2c3182d5cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80645d17b02b24a907b20d376fcb65a794768d4c9cf07550bff63d50a011836d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\
"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3fcfc9fcf5f37f32b4a654710f3f0f5c3fab5b0b5c35239e5f1a2789d1ec480\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b48e3593322a778bf8d56e3509d97a341f1fee5e172f8ba4bbc4c1dacefb3930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://193091ca5d5fb974b1e2da289e7fbc6e2d3a292e79d1936c7ba10266a5ba9779\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7feaf0932d8842db7af2f28984a811c16674b50ba28a4627829ce8471543daa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7feaf0932d8842db7af2f28984a811c16674b50ba28a4627829ce8471543daa9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:
31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://370f9a1676bfe308ad9883454af09a802df678db77e0e246de3cc7aea91410a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://370f9a1676bfe308ad9883454af09a802df678db77e0e246de3cc7aea91410a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://78b38484db8444b196494249f32bd63f2478d80e89ff823c8ac7c974b1ea6e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78b38484db8444b196494249f32bd63f2478d80e89ff823c8ac7c974b1ea6e76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:53Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:53 crc kubenswrapper[4957]: I0123 10:52:53.494539 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b07c6571fe0e39bd6607feb900919a481ef8a36483c9b4de1c6d5ea3453ba61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:53Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:53 crc kubenswrapper[4957]: I0123 10:52:53.505961 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"49d5bf3b-7b38-431d-abdd-266da7d33d54\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dea2d00e0b56b1da716e6188c5d0c1cbb52bcdf2a9483168aabc8f3bc408b7b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e85d135cfb611a98270feed36b0cb6f5992ca1432d5d1af0a62465e71ddd0244\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95ec9dd4bf806f7dd14d9e8b14fb6ccd83a8f5b1226a4cb365a946f4c6f8adad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://646a6c52a5b43295fea795b62f3903a351b07e95dc45af842bbe3c3218e143ff\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://646a6c52a5b43295fea795b62f3903a351b07e95dc45af842bbe3c3218e143ff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:53Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:53 crc kubenswrapper[4957]: I0123 10:52:53.516773 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:53Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:53 crc kubenswrapper[4957]: I0123 10:52:53.529551 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9f19c40d295c11e3a1170d61fd738b1dcd8fb10087f6a1bb74e6e6c8e6cfb56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6fcda9eaf99f7b60db85da6f18a98ccca7b5bc532aa28388fc7845caf1a7356\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:53Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:53 crc kubenswrapper[4957]: I0123 10:52:53.540728 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:53Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:53 crc kubenswrapper[4957]: I0123 10:52:53.560436 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6cq2v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11d94cd0-1619-4ef6-952a-aef84e1cdc75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdb19fbdef461009ebd78d9089ba9c94908e4c9fbcab108320e0d89c7f30547f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d28aff8ae7d4adbef5753dc21af0f1055f19ed4404bf9af31f34be215120555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5
db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d28aff8ae7d4adbef5753dc21af0f1055f19ed4404bf9af31f34be215120555\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90e2094361e033ef7c8226913e9e396e6a2036617edb9c5ad0ed00c2c5f9f707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90e2094361e033ef7c8226913e9e396e6a2036617edb9c5ad0ed00c2c5f9f707\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e945a247fdc5833508edb04c5130a406d137e3b0ab1be12f9bf954f2621c3bf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e945a247fdc5833508edb04c5130a406d137e3b0ab1be12f9bf954f2621c3bf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io
/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26ffeff282bb20d68e9f648999f28b818a67b74f062714fd26ad22e70591a74c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26ffeff282bb20d68e9f648999f28b818a67b74f062714fd26ad22e70591a74c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bbe74c2c27b97766cfa5d26987cd0fbd8289871e66eab78c54a52c0aa039728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bbe74c2c27b97766cfa5d26987cd0fbd8289871e66eab78c54a52c0aa039728\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9ee5b55e77324735662dd6bc0fdeee86af454eb4b9e8eb9e877119f7c1395ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9ee5b55e77324735662dd6bc0fdeee86af454eb4b9e8eb9e877119f7c1395ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mo
untPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6cq2v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:53Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:53 crc kubenswrapper[4957]: I0123 10:52:53.572265 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:53 crc kubenswrapper[4957]: I0123 10:52:53.572310 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:53 crc kubenswrapper[4957]: I0123 10:52:53.572320 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:53 crc kubenswrapper[4957]: I0123 10:52:53.572333 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:53 crc kubenswrapper[4957]: I0123 10:52:53.572344 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:53Z","lastTransitionTime":"2026-01-23T10:52:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:53 crc kubenswrapper[4957]: I0123 10:52:53.574946 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5fxfb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87775b38-0664-48f6-8857-7568c135bd79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrj7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrj7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:52:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5fxfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:53Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:53 crc kubenswrapper[4957]: I0123 10:52:53.586960 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2f29c0b-284f-4acf-a7cf-c6dd5c1f7ddb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0689e778b4c0b920eda290d8614f287f3f85456eb0eb55c546fef72252cbba8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e50964685dfb51b71c9b29730e05b31cdfe4bf9f81bfc1313b8c0074f615af0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e50964685dfb51b71c9b29730e05b31cdfe4bf9f81bfc1313b8c0074f615af0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:53Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:53 crc kubenswrapper[4957]: I0123 10:52:53.601491 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea507738-b425-4366-808b-3a47317e66d0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cdf71b1a8491d3a4853fde19a5b1af1eb4697cbf07de482e22a52704ba0470f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52eea4c3c7c3b8898e64dd0eb05c1883ea1c2fa94e7e606f3ab48bbf5aaee8d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f405b6b517d30a201b793965bd82536f496d62b89562cefc7e3a9d9f7829633\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b6915f908509c8609290327ffc2dccf0e5680dc227979285a7ebaca4643cb7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e837e02e63dbe59e7920302c0fb0b5c9165e96ebb684adadb02bacd61633214\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"le observer\\\\nW0123 10:51:48.273886 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0123 10:51:48.273997 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0123 10:51:48.275269 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3502356273/tls.crt::/tmp/serving-cert-3502356273/tls.key\\\\\\\"\\\\nI0123 10:51:48.537137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0123 10:51:48.548481 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0123 10:51:48.548515 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0123 10:51:48.548544 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0123 10:51:48.548577 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0123 10:51:48.561057 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0123 10:51:48.561112 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0123 10:51:48.561123 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0123 10:51:48.561139 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0123 10:51:48.561151 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0123 10:51:48.561158 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0123 10:51:48.561167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0123 10:51:48.561209 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0123 10:51:48.561756 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da77583099215643577c5d064d67ce2cca9d0b74e7ba7c88f3a948a8516fd66c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd61e9841e88c0a34764491cfafe3e4f94233e4ac0543a3e6ed1326dd8d7280d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd61e9841e88c0a34764491cfafe3e4f94233e4ac0543a3e6ed1326dd8d7280d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:53Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:53 crc kubenswrapper[4957]: I0123 10:52:53.674375 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:53 crc kubenswrapper[4957]: I0123 10:52:53.674414 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:53 crc kubenswrapper[4957]: I0123 10:52:53.674424 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:53 crc kubenswrapper[4957]: I0123 10:52:53.674440 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:53 crc kubenswrapper[4957]: I0123 10:52:53.674451 4957 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:53Z","lastTransitionTime":"2026-01-23T10:52:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:53 crc kubenswrapper[4957]: I0123 10:52:53.767131 4957 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 17:11:44.540980326 +0000 UTC Jan 23 10:52:53 crc kubenswrapper[4957]: I0123 10:52:53.776882 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:53 crc kubenswrapper[4957]: I0123 10:52:53.776933 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:53 crc kubenswrapper[4957]: I0123 10:52:53.776951 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:53 crc kubenswrapper[4957]: I0123 10:52:53.776972 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:53 crc kubenswrapper[4957]: I0123 10:52:53.776988 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:53Z","lastTransitionTime":"2026-01-23T10:52:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:53 crc kubenswrapper[4957]: I0123 10:52:53.800377 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:53 crc kubenswrapper[4957]: I0123 10:52:53.800430 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:53 crc kubenswrapper[4957]: I0123 10:52:53.800438 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:53 crc kubenswrapper[4957]: I0123 10:52:53.800453 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:53 crc kubenswrapper[4957]: I0123 10:52:53.800462 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:53Z","lastTransitionTime":"2026-01-23T10:52:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:53 crc kubenswrapper[4957]: E0123 10:52:53.815841 4957 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"608e000a-3057-4f1e-b4ab-15bf3bfea3b8\\\",\\\"systemUUID\\\":\\\"4219e85c-09d5-42d3-a5cb-7c9fe3da136f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:53Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:53 crc kubenswrapper[4957]: I0123 10:52:53.820785 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:53 crc kubenswrapper[4957]: I0123 10:52:53.820827 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 23 10:52:53 crc kubenswrapper[4957]: I0123 10:52:53.820839 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:53 crc kubenswrapper[4957]: I0123 10:52:53.820854 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:53 crc kubenswrapper[4957]: I0123 10:52:53.820865 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:53Z","lastTransitionTime":"2026-01-23T10:52:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:53 crc kubenswrapper[4957]: E0123 10:52:53.833510 4957 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"608e000a-3057-4f1e-b4ab-15bf3bfea3b8\\\",\\\"systemUUID\\\":\\\"4219e85c-09d5-42d3-a5cb-7c9fe3da136f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:53Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:53 crc kubenswrapper[4957]: I0123 10:52:53.837921 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:53 crc kubenswrapper[4957]: I0123 10:52:53.837962 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 23 10:52:53 crc kubenswrapper[4957]: I0123 10:52:53.837971 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:53 crc kubenswrapper[4957]: I0123 10:52:53.837986 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:53 crc kubenswrapper[4957]: I0123 10:52:53.837998 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:53Z","lastTransitionTime":"2026-01-23T10:52:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:53 crc kubenswrapper[4957]: E0123 10:52:53.856684 4957 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"608e000a-3057-4f1e-b4ab-15bf3bfea3b8\\\",\\\"systemUUID\\\":\\\"4219e85c-09d5-42d3-a5cb-7c9fe3da136f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:53Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:53 crc kubenswrapper[4957]: I0123 10:52:53.861368 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:53 crc kubenswrapper[4957]: I0123 10:52:53.861434 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 23 10:52:53 crc kubenswrapper[4957]: I0123 10:52:53.861458 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:53 crc kubenswrapper[4957]: I0123 10:52:53.861489 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:53 crc kubenswrapper[4957]: I0123 10:52:53.861512 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:53Z","lastTransitionTime":"2026-01-23T10:52:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:53 crc kubenswrapper[4957]: E0123 10:52:53.878229 4957 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"608e000a-3057-4f1e-b4ab-15bf3bfea3b8\\\",\\\"systemUUID\\\":\\\"4219e85c-09d5-42d3-a5cb-7c9fe3da136f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:53Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:53 crc kubenswrapper[4957]: I0123 10:52:53.883223 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:53 crc kubenswrapper[4957]: I0123 10:52:53.883331 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 23 10:52:53 crc kubenswrapper[4957]: I0123 10:52:53.883358 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:53 crc kubenswrapper[4957]: I0123 10:52:53.883389 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:53 crc kubenswrapper[4957]: I0123 10:52:53.883407 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:53Z","lastTransitionTime":"2026-01-23T10:52:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:53 crc kubenswrapper[4957]: E0123 10:52:53.902652 4957 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:52:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"608e000a-3057-4f1e-b4ab-15bf3bfea3b8\\\",\\\"systemUUID\\\":\\\"4219e85c-09d5-42d3-a5cb-7c9fe3da136f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:52:53Z is after 2025-08-24T17:21:41Z" Jan 23 10:52:53 crc kubenswrapper[4957]: E0123 10:52:53.902893 4957 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 23 10:52:53 crc kubenswrapper[4957]: I0123 10:52:53.904786 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Jan 23 10:52:53 crc kubenswrapper[4957]: I0123 10:52:53.904848 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:53 crc kubenswrapper[4957]: I0123 10:52:53.904872 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:53 crc kubenswrapper[4957]: I0123 10:52:53.904898 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:53 crc kubenswrapper[4957]: I0123 10:52:53.904916 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:53Z","lastTransitionTime":"2026-01-23T10:52:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:54 crc kubenswrapper[4957]: I0123 10:52:54.007837 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:54 crc kubenswrapper[4957]: I0123 10:52:54.007912 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:54 crc kubenswrapper[4957]: I0123 10:52:54.007955 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:54 crc kubenswrapper[4957]: I0123 10:52:54.007988 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:54 crc kubenswrapper[4957]: I0123 10:52:54.008008 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:54Z","lastTransitionTime":"2026-01-23T10:52:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:54 crc kubenswrapper[4957]: I0123 10:52:54.110575 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:54 crc kubenswrapper[4957]: I0123 10:52:54.110635 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:54 crc kubenswrapper[4957]: I0123 10:52:54.110651 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:54 crc kubenswrapper[4957]: I0123 10:52:54.110676 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:54 crc kubenswrapper[4957]: I0123 10:52:54.110695 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:54Z","lastTransitionTime":"2026-01-23T10:52:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:54 crc kubenswrapper[4957]: I0123 10:52:54.214046 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:54 crc kubenswrapper[4957]: I0123 10:52:54.214132 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:54 crc kubenswrapper[4957]: I0123 10:52:54.214158 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:54 crc kubenswrapper[4957]: I0123 10:52:54.214186 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:54 crc kubenswrapper[4957]: I0123 10:52:54.214210 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:54Z","lastTransitionTime":"2026-01-23T10:52:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:54 crc kubenswrapper[4957]: I0123 10:52:54.317154 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:54 crc kubenswrapper[4957]: I0123 10:52:54.317187 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:54 crc kubenswrapper[4957]: I0123 10:52:54.317197 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:54 crc kubenswrapper[4957]: I0123 10:52:54.317215 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:54 crc kubenswrapper[4957]: I0123 10:52:54.317226 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:54Z","lastTransitionTime":"2026-01-23T10:52:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:54 crc kubenswrapper[4957]: I0123 10:52:54.419741 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:54 crc kubenswrapper[4957]: I0123 10:52:54.419819 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:54 crc kubenswrapper[4957]: I0123 10:52:54.419846 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:54 crc kubenswrapper[4957]: I0123 10:52:54.419880 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:54 crc kubenswrapper[4957]: I0123 10:52:54.419902 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:54Z","lastTransitionTime":"2026-01-23T10:52:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:54 crc kubenswrapper[4957]: I0123 10:52:54.523815 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:54 crc kubenswrapper[4957]: I0123 10:52:54.523875 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:54 crc kubenswrapper[4957]: I0123 10:52:54.523891 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:54 crc kubenswrapper[4957]: I0123 10:52:54.523916 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:54 crc kubenswrapper[4957]: I0123 10:52:54.523935 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:54Z","lastTransitionTime":"2026-01-23T10:52:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:54 crc kubenswrapper[4957]: I0123 10:52:54.626327 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:54 crc kubenswrapper[4957]: I0123 10:52:54.626377 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:54 crc kubenswrapper[4957]: I0123 10:52:54.626392 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:54 crc kubenswrapper[4957]: I0123 10:52:54.626413 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:54 crc kubenswrapper[4957]: I0123 10:52:54.626428 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:54Z","lastTransitionTime":"2026-01-23T10:52:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:54 crc kubenswrapper[4957]: I0123 10:52:54.729014 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:54 crc kubenswrapper[4957]: I0123 10:52:54.729091 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:54 crc kubenswrapper[4957]: I0123 10:52:54.729116 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:54 crc kubenswrapper[4957]: I0123 10:52:54.729148 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:54 crc kubenswrapper[4957]: I0123 10:52:54.729174 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:54Z","lastTransitionTime":"2026-01-23T10:52:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:54 crc kubenswrapper[4957]: I0123 10:52:54.767558 4957 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 20:39:24.111806744 +0000 UTC Jan 23 10:52:54 crc kubenswrapper[4957]: I0123 10:52:54.768776 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 23 10:52:54 crc kubenswrapper[4957]: I0123 10:52:54.768875 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5fxfb" Jan 23 10:52:54 crc kubenswrapper[4957]: E0123 10:52:54.769008 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 23 10:52:54 crc kubenswrapper[4957]: I0123 10:52:54.769052 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 23 10:52:54 crc kubenswrapper[4957]: I0123 10:52:54.769023 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 23 10:52:54 crc kubenswrapper[4957]: E0123 10:52:54.769124 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5fxfb" podUID="87775b38-0664-48f6-8857-7568c135bd79" Jan 23 10:52:54 crc kubenswrapper[4957]: E0123 10:52:54.769185 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 23 10:52:54 crc kubenswrapper[4957]: E0123 10:52:54.769253 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 23 10:52:54 crc kubenswrapper[4957]: I0123 10:52:54.831953 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:54 crc kubenswrapper[4957]: I0123 10:52:54.831992 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:54 crc kubenswrapper[4957]: I0123 10:52:54.832002 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:54 crc kubenswrapper[4957]: I0123 10:52:54.832020 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:54 crc kubenswrapper[4957]: I0123 10:52:54.832032 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:54Z","lastTransitionTime":"2026-01-23T10:52:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:54 crc kubenswrapper[4957]: I0123 10:52:54.935134 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:54 crc kubenswrapper[4957]: I0123 10:52:54.935224 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:54 crc kubenswrapper[4957]: I0123 10:52:54.935240 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:54 crc kubenswrapper[4957]: I0123 10:52:54.935266 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:54 crc kubenswrapper[4957]: I0123 10:52:54.935300 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:54Z","lastTransitionTime":"2026-01-23T10:52:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:55 crc kubenswrapper[4957]: I0123 10:52:55.038471 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:55 crc kubenswrapper[4957]: I0123 10:52:55.038509 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:55 crc kubenswrapper[4957]: I0123 10:52:55.038521 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:55 crc kubenswrapper[4957]: I0123 10:52:55.038536 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:55 crc kubenswrapper[4957]: I0123 10:52:55.038545 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:55Z","lastTransitionTime":"2026-01-23T10:52:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:55 crc kubenswrapper[4957]: I0123 10:52:55.140390 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:55 crc kubenswrapper[4957]: I0123 10:52:55.140434 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:55 crc kubenswrapper[4957]: I0123 10:52:55.140447 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:55 crc kubenswrapper[4957]: I0123 10:52:55.140463 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:55 crc kubenswrapper[4957]: I0123 10:52:55.140475 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:55Z","lastTransitionTime":"2026-01-23T10:52:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:55 crc kubenswrapper[4957]: I0123 10:52:55.243468 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:55 crc kubenswrapper[4957]: I0123 10:52:55.243509 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:55 crc kubenswrapper[4957]: I0123 10:52:55.243518 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:55 crc kubenswrapper[4957]: I0123 10:52:55.243531 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:55 crc kubenswrapper[4957]: I0123 10:52:55.243539 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:55Z","lastTransitionTime":"2026-01-23T10:52:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:55 crc kubenswrapper[4957]: I0123 10:52:55.346714 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:55 crc kubenswrapper[4957]: I0123 10:52:55.346786 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:55 crc kubenswrapper[4957]: I0123 10:52:55.346809 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:55 crc kubenswrapper[4957]: I0123 10:52:55.346838 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:55 crc kubenswrapper[4957]: I0123 10:52:55.346861 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:55Z","lastTransitionTime":"2026-01-23T10:52:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:55 crc kubenswrapper[4957]: I0123 10:52:55.449181 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:55 crc kubenswrapper[4957]: I0123 10:52:55.449241 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:55 crc kubenswrapper[4957]: I0123 10:52:55.449257 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:55 crc kubenswrapper[4957]: I0123 10:52:55.449306 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:55 crc kubenswrapper[4957]: I0123 10:52:55.449320 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:55Z","lastTransitionTime":"2026-01-23T10:52:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:55 crc kubenswrapper[4957]: I0123 10:52:55.552130 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:55 crc kubenswrapper[4957]: I0123 10:52:55.552189 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:55 crc kubenswrapper[4957]: I0123 10:52:55.552212 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:55 crc kubenswrapper[4957]: I0123 10:52:55.552258 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:55 crc kubenswrapper[4957]: I0123 10:52:55.552311 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:55Z","lastTransitionTime":"2026-01-23T10:52:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:55 crc kubenswrapper[4957]: I0123 10:52:55.655231 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:55 crc kubenswrapper[4957]: I0123 10:52:55.655268 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:55 crc kubenswrapper[4957]: I0123 10:52:55.655294 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:55 crc kubenswrapper[4957]: I0123 10:52:55.655311 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:55 crc kubenswrapper[4957]: I0123 10:52:55.655322 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:55Z","lastTransitionTime":"2026-01-23T10:52:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:55 crc kubenswrapper[4957]: I0123 10:52:55.758102 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:55 crc kubenswrapper[4957]: I0123 10:52:55.758179 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:55 crc kubenswrapper[4957]: I0123 10:52:55.758209 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:55 crc kubenswrapper[4957]: I0123 10:52:55.758238 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:55 crc kubenswrapper[4957]: I0123 10:52:55.758258 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:55Z","lastTransitionTime":"2026-01-23T10:52:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:55 crc kubenswrapper[4957]: I0123 10:52:55.768356 4957 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-01 07:18:00.203121008 +0000 UTC Jan 23 10:52:55 crc kubenswrapper[4957]: I0123 10:52:55.860877 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:55 crc kubenswrapper[4957]: I0123 10:52:55.860918 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:55 crc kubenswrapper[4957]: I0123 10:52:55.860934 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:55 crc kubenswrapper[4957]: I0123 10:52:55.860957 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:55 crc kubenswrapper[4957]: I0123 10:52:55.860974 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:55Z","lastTransitionTime":"2026-01-23T10:52:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:55 crc kubenswrapper[4957]: I0123 10:52:55.963449 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:55 crc kubenswrapper[4957]: I0123 10:52:55.963519 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:55 crc kubenswrapper[4957]: I0123 10:52:55.963543 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:55 crc kubenswrapper[4957]: I0123 10:52:55.963574 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:55 crc kubenswrapper[4957]: I0123 10:52:55.963599 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:55Z","lastTransitionTime":"2026-01-23T10:52:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:56 crc kubenswrapper[4957]: I0123 10:52:56.066670 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:56 crc kubenswrapper[4957]: I0123 10:52:56.066731 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:56 crc kubenswrapper[4957]: I0123 10:52:56.066747 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:56 crc kubenswrapper[4957]: I0123 10:52:56.066771 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:56 crc kubenswrapper[4957]: I0123 10:52:56.066788 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:56Z","lastTransitionTime":"2026-01-23T10:52:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:56 crc kubenswrapper[4957]: I0123 10:52:56.169476 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:56 crc kubenswrapper[4957]: I0123 10:52:56.169549 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:56 crc kubenswrapper[4957]: I0123 10:52:56.169573 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:56 crc kubenswrapper[4957]: I0123 10:52:56.169602 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:56 crc kubenswrapper[4957]: I0123 10:52:56.169624 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:56Z","lastTransitionTime":"2026-01-23T10:52:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:56 crc kubenswrapper[4957]: I0123 10:52:56.272337 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:56 crc kubenswrapper[4957]: I0123 10:52:56.272395 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:56 crc kubenswrapper[4957]: I0123 10:52:56.272404 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:56 crc kubenswrapper[4957]: I0123 10:52:56.272421 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:56 crc kubenswrapper[4957]: I0123 10:52:56.272431 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:56Z","lastTransitionTime":"2026-01-23T10:52:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:56 crc kubenswrapper[4957]: I0123 10:52:56.375032 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:56 crc kubenswrapper[4957]: I0123 10:52:56.375101 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:56 crc kubenswrapper[4957]: I0123 10:52:56.375125 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:56 crc kubenswrapper[4957]: I0123 10:52:56.375153 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:56 crc kubenswrapper[4957]: I0123 10:52:56.375170 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:56Z","lastTransitionTime":"2026-01-23T10:52:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:56 crc kubenswrapper[4957]: I0123 10:52:56.477388 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:56 crc kubenswrapper[4957]: I0123 10:52:56.477453 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:56 crc kubenswrapper[4957]: I0123 10:52:56.477470 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:56 crc kubenswrapper[4957]: I0123 10:52:56.477500 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:56 crc kubenswrapper[4957]: I0123 10:52:56.477523 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:56Z","lastTransitionTime":"2026-01-23T10:52:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:56 crc kubenswrapper[4957]: I0123 10:52:56.580777 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:56 crc kubenswrapper[4957]: I0123 10:52:56.580829 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:56 crc kubenswrapper[4957]: I0123 10:52:56.580846 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:56 crc kubenswrapper[4957]: I0123 10:52:56.580867 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:56 crc kubenswrapper[4957]: I0123 10:52:56.580881 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:56Z","lastTransitionTime":"2026-01-23T10:52:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:56 crc kubenswrapper[4957]: I0123 10:52:56.683680 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:56 crc kubenswrapper[4957]: I0123 10:52:56.683723 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:56 crc kubenswrapper[4957]: I0123 10:52:56.683735 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:56 crc kubenswrapper[4957]: I0123 10:52:56.683750 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:56 crc kubenswrapper[4957]: I0123 10:52:56.683763 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:56Z","lastTransitionTime":"2026-01-23T10:52:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:56 crc kubenswrapper[4957]: I0123 10:52:56.768520 4957 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-17 12:50:45.993549425 +0000 UTC Jan 23 10:52:56 crc kubenswrapper[4957]: I0123 10:52:56.768893 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 23 10:52:56 crc kubenswrapper[4957]: E0123 10:52:56.769001 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 23 10:52:56 crc kubenswrapper[4957]: I0123 10:52:56.769049 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 23 10:52:56 crc kubenswrapper[4957]: I0123 10:52:56.769146 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5fxfb" Jan 23 10:52:56 crc kubenswrapper[4957]: I0123 10:52:56.769198 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 23 10:52:56 crc kubenswrapper[4957]: E0123 10:52:56.769307 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 23 10:52:56 crc kubenswrapper[4957]: E0123 10:52:56.769437 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5fxfb" podUID="87775b38-0664-48f6-8857-7568c135bd79" Jan 23 10:52:56 crc kubenswrapper[4957]: E0123 10:52:56.769574 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 23 10:52:56 crc kubenswrapper[4957]: I0123 10:52:56.786948 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:56 crc kubenswrapper[4957]: I0123 10:52:56.787000 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:56 crc kubenswrapper[4957]: I0123 10:52:56.787011 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:56 crc kubenswrapper[4957]: I0123 10:52:56.787030 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:56 crc kubenswrapper[4957]: I0123 10:52:56.787041 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:56Z","lastTransitionTime":"2026-01-23T10:52:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:56 crc kubenswrapper[4957]: I0123 10:52:56.889826 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:56 crc kubenswrapper[4957]: I0123 10:52:56.889869 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:56 crc kubenswrapper[4957]: I0123 10:52:56.889896 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:56 crc kubenswrapper[4957]: I0123 10:52:56.889912 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:56 crc kubenswrapper[4957]: I0123 10:52:56.889922 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:56Z","lastTransitionTime":"2026-01-23T10:52:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:56 crc kubenswrapper[4957]: I0123 10:52:56.998700 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:56 crc kubenswrapper[4957]: I0123 10:52:56.998809 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:56 crc kubenswrapper[4957]: I0123 10:52:56.998836 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:56 crc kubenswrapper[4957]: I0123 10:52:56.998868 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:56 crc kubenswrapper[4957]: I0123 10:52:56.998893 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:56Z","lastTransitionTime":"2026-01-23T10:52:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:57 crc kubenswrapper[4957]: I0123 10:52:57.102125 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:57 crc kubenswrapper[4957]: I0123 10:52:57.102225 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:57 crc kubenswrapper[4957]: I0123 10:52:57.102307 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:57 crc kubenswrapper[4957]: I0123 10:52:57.102333 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:57 crc kubenswrapper[4957]: I0123 10:52:57.102351 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:57Z","lastTransitionTime":"2026-01-23T10:52:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:57 crc kubenswrapper[4957]: I0123 10:52:57.205662 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:57 crc kubenswrapper[4957]: I0123 10:52:57.205716 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:57 crc kubenswrapper[4957]: I0123 10:52:57.205737 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:57 crc kubenswrapper[4957]: I0123 10:52:57.205762 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:57 crc kubenswrapper[4957]: I0123 10:52:57.205781 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:57Z","lastTransitionTime":"2026-01-23T10:52:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:57 crc kubenswrapper[4957]: I0123 10:52:57.309500 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:57 crc kubenswrapper[4957]: I0123 10:52:57.309561 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:57 crc kubenswrapper[4957]: I0123 10:52:57.309580 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:57 crc kubenswrapper[4957]: I0123 10:52:57.309606 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:57 crc kubenswrapper[4957]: I0123 10:52:57.309625 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:57Z","lastTransitionTime":"2026-01-23T10:52:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:57 crc kubenswrapper[4957]: I0123 10:52:57.413468 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:57 crc kubenswrapper[4957]: I0123 10:52:57.413577 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:57 crc kubenswrapper[4957]: I0123 10:52:57.413604 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:57 crc kubenswrapper[4957]: I0123 10:52:57.413637 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:57 crc kubenswrapper[4957]: I0123 10:52:57.413661 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:57Z","lastTransitionTime":"2026-01-23T10:52:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:57 crc kubenswrapper[4957]: I0123 10:52:57.516208 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:57 crc kubenswrapper[4957]: I0123 10:52:57.516366 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:57 crc kubenswrapper[4957]: I0123 10:52:57.516433 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:57 crc kubenswrapper[4957]: I0123 10:52:57.516458 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:57 crc kubenswrapper[4957]: I0123 10:52:57.516518 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:57Z","lastTransitionTime":"2026-01-23T10:52:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:57 crc kubenswrapper[4957]: I0123 10:52:57.619722 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:57 crc kubenswrapper[4957]: I0123 10:52:57.619782 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:57 crc kubenswrapper[4957]: I0123 10:52:57.619797 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:57 crc kubenswrapper[4957]: I0123 10:52:57.619818 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:57 crc kubenswrapper[4957]: I0123 10:52:57.619833 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:57Z","lastTransitionTime":"2026-01-23T10:52:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:57 crc kubenswrapper[4957]: I0123 10:52:57.722233 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:57 crc kubenswrapper[4957]: I0123 10:52:57.722330 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:57 crc kubenswrapper[4957]: I0123 10:52:57.722349 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:57 crc kubenswrapper[4957]: I0123 10:52:57.722375 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:57 crc kubenswrapper[4957]: I0123 10:52:57.722395 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:57Z","lastTransitionTime":"2026-01-23T10:52:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:57 crc kubenswrapper[4957]: I0123 10:52:57.769258 4957 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-18 03:04:45.409180961 +0000 UTC Jan 23 10:52:57 crc kubenswrapper[4957]: I0123 10:52:57.825569 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:57 crc kubenswrapper[4957]: I0123 10:52:57.825628 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:57 crc kubenswrapper[4957]: I0123 10:52:57.825644 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:57 crc kubenswrapper[4957]: I0123 10:52:57.825665 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:57 crc kubenswrapper[4957]: I0123 10:52:57.825680 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:57Z","lastTransitionTime":"2026-01-23T10:52:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:57 crc kubenswrapper[4957]: I0123 10:52:57.929232 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:57 crc kubenswrapper[4957]: I0123 10:52:57.929670 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:57 crc kubenswrapper[4957]: I0123 10:52:57.929838 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:57 crc kubenswrapper[4957]: I0123 10:52:57.930024 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:57 crc kubenswrapper[4957]: I0123 10:52:57.930172 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:57Z","lastTransitionTime":"2026-01-23T10:52:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:58 crc kubenswrapper[4957]: I0123 10:52:58.033433 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:58 crc kubenswrapper[4957]: I0123 10:52:58.033460 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:58 crc kubenswrapper[4957]: I0123 10:52:58.033468 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:58 crc kubenswrapper[4957]: I0123 10:52:58.033480 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:58 crc kubenswrapper[4957]: I0123 10:52:58.033489 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:58Z","lastTransitionTime":"2026-01-23T10:52:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:58 crc kubenswrapper[4957]: I0123 10:52:58.135910 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:58 crc kubenswrapper[4957]: I0123 10:52:58.135947 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:58 crc kubenswrapper[4957]: I0123 10:52:58.135955 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:58 crc kubenswrapper[4957]: I0123 10:52:58.135969 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:58 crc kubenswrapper[4957]: I0123 10:52:58.135978 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:58Z","lastTransitionTime":"2026-01-23T10:52:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:58 crc kubenswrapper[4957]: I0123 10:52:58.239319 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:58 crc kubenswrapper[4957]: I0123 10:52:58.239690 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:58 crc kubenswrapper[4957]: I0123 10:52:58.239902 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:58 crc kubenswrapper[4957]: I0123 10:52:58.240111 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:58 crc kubenswrapper[4957]: I0123 10:52:58.240380 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:58Z","lastTransitionTime":"2026-01-23T10:52:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:58 crc kubenswrapper[4957]: I0123 10:52:58.344265 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:58 crc kubenswrapper[4957]: I0123 10:52:58.344390 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:58 crc kubenswrapper[4957]: I0123 10:52:58.344408 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:58 crc kubenswrapper[4957]: I0123 10:52:58.344434 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:58 crc kubenswrapper[4957]: I0123 10:52:58.344452 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:58Z","lastTransitionTime":"2026-01-23T10:52:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:58 crc kubenswrapper[4957]: I0123 10:52:58.447680 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:58 crc kubenswrapper[4957]: I0123 10:52:58.447737 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:58 crc kubenswrapper[4957]: I0123 10:52:58.447753 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:58 crc kubenswrapper[4957]: I0123 10:52:58.447778 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:58 crc kubenswrapper[4957]: I0123 10:52:58.447800 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:58Z","lastTransitionTime":"2026-01-23T10:52:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:58 crc kubenswrapper[4957]: I0123 10:52:58.555117 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:58 crc kubenswrapper[4957]: I0123 10:52:58.555191 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:58 crc kubenswrapper[4957]: I0123 10:52:58.555209 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:58 crc kubenswrapper[4957]: I0123 10:52:58.555243 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:58 crc kubenswrapper[4957]: I0123 10:52:58.555264 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:58Z","lastTransitionTime":"2026-01-23T10:52:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:58 crc kubenswrapper[4957]: I0123 10:52:58.658309 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:58 crc kubenswrapper[4957]: I0123 10:52:58.658363 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:58 crc kubenswrapper[4957]: I0123 10:52:58.658381 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:58 crc kubenswrapper[4957]: I0123 10:52:58.658403 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:58 crc kubenswrapper[4957]: I0123 10:52:58.658419 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:58Z","lastTransitionTime":"2026-01-23T10:52:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:58 crc kubenswrapper[4957]: I0123 10:52:58.760459 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:58 crc kubenswrapper[4957]: I0123 10:52:58.760507 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:58 crc kubenswrapper[4957]: I0123 10:52:58.760516 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:58 crc kubenswrapper[4957]: I0123 10:52:58.760529 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:58 crc kubenswrapper[4957]: I0123 10:52:58.760538 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:58Z","lastTransitionTime":"2026-01-23T10:52:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:58 crc kubenswrapper[4957]: I0123 10:52:58.769680 4957 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-05 07:51:24.520701234 +0000 UTC Jan 23 10:52:58 crc kubenswrapper[4957]: I0123 10:52:58.769865 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 23 10:52:58 crc kubenswrapper[4957]: I0123 10:52:58.769873 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 23 10:52:58 crc kubenswrapper[4957]: I0123 10:52:58.769997 4957 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 23 10:52:58 crc kubenswrapper[4957]: E0123 10:52:58.770001 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 23 10:52:58 crc kubenswrapper[4957]: I0123 10:52:58.770040 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5fxfb" Jan 23 10:52:58 crc kubenswrapper[4957]: E0123 10:52:58.770144 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 23 10:52:58 crc kubenswrapper[4957]: E0123 10:52:58.770326 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 23 10:52:58 crc kubenswrapper[4957]: E0123 10:52:58.770441 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5fxfb" podUID="87775b38-0664-48f6-8857-7568c135bd79" Jan 23 10:52:58 crc kubenswrapper[4957]: I0123 10:52:58.862324 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:58 crc kubenswrapper[4957]: I0123 10:52:58.862356 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:58 crc kubenswrapper[4957]: I0123 10:52:58.862368 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:58 crc kubenswrapper[4957]: I0123 10:52:58.862385 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:58 crc kubenswrapper[4957]: I0123 10:52:58.862396 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:58Z","lastTransitionTime":"2026-01-23T10:52:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:58 crc kubenswrapper[4957]: I0123 10:52:58.965042 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:58 crc kubenswrapper[4957]: I0123 10:52:58.965088 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:58 crc kubenswrapper[4957]: I0123 10:52:58.965104 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:58 crc kubenswrapper[4957]: I0123 10:52:58.965121 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:58 crc kubenswrapper[4957]: I0123 10:52:58.965131 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:58Z","lastTransitionTime":"2026-01-23T10:52:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:59 crc kubenswrapper[4957]: I0123 10:52:59.067838 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:59 crc kubenswrapper[4957]: I0123 10:52:59.067898 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:59 crc kubenswrapper[4957]: I0123 10:52:59.067915 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:59 crc kubenswrapper[4957]: I0123 10:52:59.067938 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:59 crc kubenswrapper[4957]: I0123 10:52:59.067959 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:59Z","lastTransitionTime":"2026-01-23T10:52:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:59 crc kubenswrapper[4957]: I0123 10:52:59.170953 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:59 crc kubenswrapper[4957]: I0123 10:52:59.171322 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:59 crc kubenswrapper[4957]: I0123 10:52:59.171466 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:59 crc kubenswrapper[4957]: I0123 10:52:59.171654 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:59 crc kubenswrapper[4957]: I0123 10:52:59.171789 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:59Z","lastTransitionTime":"2026-01-23T10:52:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:59 crc kubenswrapper[4957]: I0123 10:52:59.275395 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:59 crc kubenswrapper[4957]: I0123 10:52:59.275471 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:59 crc kubenswrapper[4957]: I0123 10:52:59.275489 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:59 crc kubenswrapper[4957]: I0123 10:52:59.275513 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:59 crc kubenswrapper[4957]: I0123 10:52:59.275530 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:59Z","lastTransitionTime":"2026-01-23T10:52:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:59 crc kubenswrapper[4957]: I0123 10:52:59.378679 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:59 crc kubenswrapper[4957]: I0123 10:52:59.378783 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:59 crc kubenswrapper[4957]: I0123 10:52:59.378809 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:59 crc kubenswrapper[4957]: I0123 10:52:59.378839 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:59 crc kubenswrapper[4957]: I0123 10:52:59.378862 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:59Z","lastTransitionTime":"2026-01-23T10:52:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:59 crc kubenswrapper[4957]: I0123 10:52:59.481356 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:59 crc kubenswrapper[4957]: I0123 10:52:59.481428 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:59 crc kubenswrapper[4957]: I0123 10:52:59.481445 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:59 crc kubenswrapper[4957]: I0123 10:52:59.481468 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:59 crc kubenswrapper[4957]: I0123 10:52:59.481486 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:59Z","lastTransitionTime":"2026-01-23T10:52:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:59 crc kubenswrapper[4957]: I0123 10:52:59.583996 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:59 crc kubenswrapper[4957]: I0123 10:52:59.584493 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:59 crc kubenswrapper[4957]: I0123 10:52:59.584540 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:59 crc kubenswrapper[4957]: I0123 10:52:59.584565 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:59 crc kubenswrapper[4957]: I0123 10:52:59.584582 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:59Z","lastTransitionTime":"2026-01-23T10:52:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:59 crc kubenswrapper[4957]: I0123 10:52:59.688087 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:59 crc kubenswrapper[4957]: I0123 10:52:59.688147 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:59 crc kubenswrapper[4957]: I0123 10:52:59.688161 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:59 crc kubenswrapper[4957]: I0123 10:52:59.688178 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:59 crc kubenswrapper[4957]: I0123 10:52:59.688189 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:59Z","lastTransitionTime":"2026-01-23T10:52:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:59 crc kubenswrapper[4957]: I0123 10:52:59.769815 4957 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-19 06:22:22.50497568 +0000 UTC Jan 23 10:52:59 crc kubenswrapper[4957]: I0123 10:52:59.791189 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:59 crc kubenswrapper[4957]: I0123 10:52:59.791240 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:59 crc kubenswrapper[4957]: I0123 10:52:59.791249 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:59 crc kubenswrapper[4957]: I0123 10:52:59.791264 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:59 crc kubenswrapper[4957]: I0123 10:52:59.791351 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:59Z","lastTransitionTime":"2026-01-23T10:52:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:52:59 crc kubenswrapper[4957]: I0123 10:52:59.894366 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:59 crc kubenswrapper[4957]: I0123 10:52:59.894437 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:59 crc kubenswrapper[4957]: I0123 10:52:59.894458 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:59 crc kubenswrapper[4957]: I0123 10:52:59.894510 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:59 crc kubenswrapper[4957]: I0123 10:52:59.894534 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:59Z","lastTransitionTime":"2026-01-23T10:52:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:52:59 crc kubenswrapper[4957]: I0123 10:52:59.997795 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:52:59 crc kubenswrapper[4957]: I0123 10:52:59.997867 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:52:59 crc kubenswrapper[4957]: I0123 10:52:59.997887 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:52:59 crc kubenswrapper[4957]: I0123 10:52:59.997919 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:52:59 crc kubenswrapper[4957]: I0123 10:52:59.997939 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:52:59Z","lastTransitionTime":"2026-01-23T10:52:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:53:00 crc kubenswrapper[4957]: I0123 10:53:00.100973 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:00 crc kubenswrapper[4957]: I0123 10:53:00.101112 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:00 crc kubenswrapper[4957]: I0123 10:53:00.101131 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:00 crc kubenswrapper[4957]: I0123 10:53:00.101161 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:00 crc kubenswrapper[4957]: I0123 10:53:00.101183 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:00Z","lastTransitionTime":"2026-01-23T10:53:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:53:00 crc kubenswrapper[4957]: I0123 10:53:00.205171 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:00 crc kubenswrapper[4957]: I0123 10:53:00.205225 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:00 crc kubenswrapper[4957]: I0123 10:53:00.205241 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:00 crc kubenswrapper[4957]: I0123 10:53:00.205351 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:00 crc kubenswrapper[4957]: I0123 10:53:00.205366 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:00Z","lastTransitionTime":"2026-01-23T10:53:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:53:00 crc kubenswrapper[4957]: I0123 10:53:00.308657 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:00 crc kubenswrapper[4957]: I0123 10:53:00.308709 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:00 crc kubenswrapper[4957]: I0123 10:53:00.308726 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:00 crc kubenswrapper[4957]: I0123 10:53:00.308748 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:00 crc kubenswrapper[4957]: I0123 10:53:00.308765 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:00Z","lastTransitionTime":"2026-01-23T10:53:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:53:00 crc kubenswrapper[4957]: I0123 10:53:00.411188 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:00 crc kubenswrapper[4957]: I0123 10:53:00.411238 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:00 crc kubenswrapper[4957]: I0123 10:53:00.411249 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:00 crc kubenswrapper[4957]: I0123 10:53:00.411495 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:00 crc kubenswrapper[4957]: I0123 10:53:00.411512 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:00Z","lastTransitionTime":"2026-01-23T10:53:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:53:00 crc kubenswrapper[4957]: I0123 10:53:00.514161 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:00 crc kubenswrapper[4957]: I0123 10:53:00.514213 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:00 crc kubenswrapper[4957]: I0123 10:53:00.514242 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:00 crc kubenswrapper[4957]: I0123 10:53:00.514332 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:00 crc kubenswrapper[4957]: I0123 10:53:00.514359 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:00Z","lastTransitionTime":"2026-01-23T10:53:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:53:00 crc kubenswrapper[4957]: I0123 10:53:00.617776 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:00 crc kubenswrapper[4957]: I0123 10:53:00.617833 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:00 crc kubenswrapper[4957]: I0123 10:53:00.617847 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:00 crc kubenswrapper[4957]: I0123 10:53:00.617868 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:00 crc kubenswrapper[4957]: I0123 10:53:00.617883 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:00Z","lastTransitionTime":"2026-01-23T10:53:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:53:00 crc kubenswrapper[4957]: I0123 10:53:00.721481 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:00 crc kubenswrapper[4957]: I0123 10:53:00.721560 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:00 crc kubenswrapper[4957]: I0123 10:53:00.721583 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:00 crc kubenswrapper[4957]: I0123 10:53:00.721614 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:00 crc kubenswrapper[4957]: I0123 10:53:00.721639 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:00Z","lastTransitionTime":"2026-01-23T10:53:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:53:00 crc kubenswrapper[4957]: I0123 10:53:00.769119 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 23 10:53:00 crc kubenswrapper[4957]: E0123 10:53:00.769440 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 23 10:53:00 crc kubenswrapper[4957]: I0123 10:53:00.769571 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 23 10:53:00 crc kubenswrapper[4957]: I0123 10:53:00.769669 4957 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-5fxfb" Jan 23 10:53:00 crc kubenswrapper[4957]: I0123 10:53:00.769788 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 23 10:53:00 crc kubenswrapper[4957]: E0123 10:53:00.769801 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 23 10:53:00 crc kubenswrapper[4957]: E0123 10:53:00.770016 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 23 10:53:00 crc kubenswrapper[4957]: I0123 10:53:00.770084 4957 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-11 21:39:15.195255729 +0000 UTC Jan 23 10:53:00 crc kubenswrapper[4957]: E0123 10:53:00.770184 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5fxfb" podUID="87775b38-0664-48f6-8857-7568c135bd79" Jan 23 10:53:00 crc kubenswrapper[4957]: I0123 10:53:00.799181 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8631604-63ce-40b0-b27e-fba17f940f20\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://523b9a208f414955faffe254957d3bb6d287eab26ea653e23c9bcc2c3182d5cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80645d17b02b24a907b20d376fcb65a794768d4c9cf07550bff63d50a011836d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3fcfc9fcf5f37f32b4a654710f3f0f5c3fab5b0b5c35239e5f1a2789d1ec480\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-
01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b48e3593322a778bf8d56e3509d97a341f1fee5e172f8ba4bbc4c1dacefb3930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://193091ca5d5fb974b1e2da289e7fbc6e2d3a292e79d1936c7ba10266a5ba9779\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7feaf0932d8842db7af2f28984a811c16674b50ba28a4627829ce8471543daa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7feaf0932d8842db7af2f28984a811c16674b50ba28a4627829ce8471543daa9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://370f9a1676bfe308ad9883454af09a802df678db77e0e246de3cc7aea91410a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://370f9a1676bfe308ad9883454af09a802df678d
b77e0e246de3cc7aea91410a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://78b38484db8444b196494249f32bd63f2478d80e89ff823c8ac7c974b1ea6e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78b38484db8444b196494249f32bd63f2478d80e89ff823c8ac7c974b1ea6e76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:53:00Z is after 2025-08-24T17:21:41Z" Jan 23 10:53:00 crc kubenswrapper[4957]: I0123 10:53:00.818957 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b07c6571fe0e39bd6607feb900919a481ef8a36483c9b4de1c6d5ea3453ba61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:53:00Z is after 2025-08-24T17:21:41Z" Jan 23 10:53:00 crc kubenswrapper[4957]: I0123 10:53:00.825140 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:00 crc kubenswrapper[4957]: I0123 10:53:00.825207 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:00 crc kubenswrapper[4957]: I0123 10:53:00.825230 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:00 crc kubenswrapper[4957]: I0123 10:53:00.825261 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:00 crc kubenswrapper[4957]: I0123 10:53:00.825319 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:00Z","lastTransitionTime":"2026-01-23T10:53:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:53:00 crc kubenswrapper[4957]: I0123 10:53:00.838370 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fnxz6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c7b1449-2e9b-4c07-a531-591cb968f511\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6410b47b5b38b4ce50175e8cd9c2cc7ca241b914d1dba4accf3a1deb3e066ccd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj687\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:49Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fnxz6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:53:00Z is after 2025-08-24T17:21:41Z" Jan 23 10:53:00 crc kubenswrapper[4957]: I0123 10:53:00.854762 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2f29c0b-284f-4acf-a7cf-c6dd5c1f7ddb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0689e778b4c0b920eda290d8614f287f3f85456eb0eb55c546fef72252cbba8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e50964685dfb51b71c9b29730e05b31cdfe4bf9f81bfc1313b8c0074f615af0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e50964685dfb51b71c9b29730e05b31cdfe4bf9f81bfc1313b8c0074f615af0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:53:00Z is after 2025-08-24T17:21:41Z" Jan 23 10:53:00 crc kubenswrapper[4957]: I0123 10:53:00.871993 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea507738-b425-4366-808b-3a47317e66d0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cdf71b1a8491d3a4853fde19a5b1af1eb4697cbf07de482e22a52704ba0470f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52eea4c3c7c3b8898e64dd0eb05c1883ea1c2fa94e7e606f3ab48bbf5aaee8d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f405b6b517d30a201b793965bd82536f496d62b89562cefc7e3a9d9f7829633\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b6915f908509c8609290327ffc2dccf0e5680dc227979285a7ebaca4643cb7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e837e02e63dbe59e7920302c0fb0b5c9165e96ebb684adadb02bacd61633214\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"le observer\\\\nW0123 10:51:48.273886 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0123 10:51:48.273997 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0123 10:51:48.275269 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3502356273/tls.crt::/tmp/serving-cert-3502356273/tls.key\\\\\\\"\\\\nI0123 10:51:48.537137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0123 10:51:48.548481 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0123 10:51:48.548515 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0123 10:51:48.548544 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0123 10:51:48.548577 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0123 10:51:48.561057 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0123 10:51:48.561112 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0123 10:51:48.561123 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0123 10:51:48.561139 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0123 10:51:48.561151 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0123 10:51:48.561158 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0123 10:51:48.561167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0123 10:51:48.561209 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0123 10:51:48.561756 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da77583099215643577c5d064d67ce2cca9d0b74e7ba7c88f3a948a8516fd66c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd61e9841e88c0a34764491cfafe3e4f94233e4ac0543a3e6ed1326dd8d7280d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd61e9841e88c0a34764491cfafe3e4f94233e4ac0543a3e6ed1326dd8d7280d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:53:00Z is after 2025-08-24T17:21:41Z" Jan 23 10:53:00 crc kubenswrapper[4957]: I0123 10:53:00.890043 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"49d5bf3b-7b38-431d-abdd-266da7d33d54\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dea2d00e0b56b1da716e6188c5d0c1cbb52bcdf2a9483168aabc8f3bc408b7b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e85d135cfb611a98270feed36b0cb6f5992ca1432d5d1af0a62465e71ddd0244\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95ec9dd4bf806f7dd14d9e8b14fb6ccd83a8f5b1226a4cb365a946f4c6f8adad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://646a6c52a5b43295fea795b62f3903a351b07e95dc45af842bbe3c3218e143ff\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://646a6c52a5b43295fea795b62f3903a351b07e95dc45af842bbe3c3218e143ff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:53:00Z is after 2025-08-24T17:21:41Z" Jan 23 10:53:00 crc kubenswrapper[4957]: I0123 10:53:00.907364 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:53:00Z is after 2025-08-24T17:21:41Z" Jan 23 10:53:00 crc kubenswrapper[4957]: I0123 10:53:00.930085 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9f19c40d295c11e3a1170d61fd738b1dcd8fb10087f6a1bb74e6e6c8e6cfb56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6fcda9eaf99f7b60db85da6f18a98ccca7b5bc532aa28388fc7845caf1a7356\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:53:00Z is after 2025-08-24T17:21:41Z" Jan 23 10:53:00 crc kubenswrapper[4957]: I0123 10:53:00.933562 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:00 crc kubenswrapper[4957]: I0123 10:53:00.933666 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:00 crc kubenswrapper[4957]: I0123 10:53:00.933695 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:00 crc kubenswrapper[4957]: I0123 10:53:00.933721 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:00 crc kubenswrapper[4957]: I0123 10:53:00.933737 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:00Z","lastTransitionTime":"2026-01-23T10:53:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:53:00 crc kubenswrapper[4957]: I0123 10:53:00.942967 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:53:00Z is after 2025-08-24T17:21:41Z" Jan 23 10:53:00 crc kubenswrapper[4957]: I0123 10:53:00.980482 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6cq2v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11d94cd0-1619-4ef6-952a-aef84e1cdc75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdb19fbdef461009ebd78d9089ba9c94908e4c9fbcab108320e0d89c7f30547f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d28aff8ae7d4adbef5753dc21af0f1055f19ed4404bf9af31f34be215120555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5
db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d28aff8ae7d4adbef5753dc21af0f1055f19ed4404bf9af31f34be215120555\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90e2094361e033ef7c8226913e9e396e6a2036617edb9c5ad0ed00c2c5f9f707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90e2094361e033ef7c8226913e9e396e6a2036617edb9c5ad0ed00c2c5f9f707\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e945a247fdc5833508edb04c5130a406d137e3b0ab1be12f9bf954f2621c3bf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e945a247fdc5833508edb04c5130a406d137e3b0ab1be12f9bf954f2621c3bf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io
/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26ffeff282bb20d68e9f648999f28b818a67b74f062714fd26ad22e70591a74c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26ffeff282bb20d68e9f648999f28b818a67b74f062714fd26ad22e70591a74c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bbe74c2c27b97766cfa5d26987cd0fbd8289871e66eab78c54a52c0aa039728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bbe74c2c27b97766cfa5d26987cd0fbd8289871e66eab78c54a52c0aa039728\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9ee5b55e77324735662dd6bc0fdeee86af454eb4b9e8eb9e877119f7c1395ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9ee5b55e77324735662dd6bc0fdeee86af454eb4b9e8eb9e877119f7c1395ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mo
untPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd8wk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6cq2v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:53:00Z is after 2025-08-24T17:21:41Z" Jan 23 10:53:00 crc kubenswrapper[4957]: I0123 10:53:00.991253 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5fxfb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87775b38-0664-48f6-8857-7568c135bd79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrj7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrj7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:52:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5fxfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:53:00Z is after 2025-08-24T17:21:41Z" Jan 23 10:53:01 crc kubenswrapper[4957]: I0123 10:53:01.002819 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://512cd439903792d034cd6017d149d8f3e9e24ffbfc36964572fc9419d54c3513\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:53:01Z is after 2025-08-24T17:21:41Z" Jan 23 10:53:01 crc kubenswrapper[4957]: I0123 10:53:01.013417 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:53:01Z is after 2025-08-24T17:21:41Z" Jan 23 10:53:01 crc kubenswrapper[4957]: I0123 10:53:01.024135 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb53662e-fe72-4c19-b3a6-f5b541e5afcb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://829cfdb541d2a7861316957f39b8b9f43ec6f9f4e309a491f4451b1f3c34a9a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d9f270c80ebedc7d79510e2f421e23789483dce954f5e1469469703660febf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\
\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90926087d1bb350c991fa9425706fcc22e12eec003aba87b72758892aae9d3a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8144556693b41dc2f9121be49ceed161caf8db5eec797f086128a2016be8072\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:53:01Z is after 2025-08-24T17:21:41Z" Jan 23 10:53:01 crc kubenswrapper[4957]: I0123 10:53:01.034857 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rg9hb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a6ddd9-627a-4faa-a4c4-096ea19af31d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f469de9a3c43ade33ae855757f1244dcd825827dea9633af7143c078b08d6d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wngq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rg9hb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:53:01Z is after 2025-08-24T17:21:41Z" Jan 23 10:53:01 crc kubenswrapper[4957]: I0123 10:53:01.036572 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:01 crc kubenswrapper[4957]: I0123 10:53:01.036597 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:01 crc kubenswrapper[4957]: I0123 10:53:01.036607 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:01 crc kubenswrapper[4957]: I0123 10:53:01.036621 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:01 crc kubenswrapper[4957]: I0123 10:53:01.036633 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:01Z","lastTransitionTime":"2026-01-23T10:53:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:53:01 crc kubenswrapper[4957]: I0123 10:53:01.045601 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2xjv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"224e3211-1f68-4673-8975-7e71b1e513d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dd046581d049e9ca0071a010da143a9b28d271b533b9cdc1c94d19311be0320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s26mf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f355e8990ff693448c7b8df392b7b2caeb59d6fee6cf8d5d4200f8ce1b5e03ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s26mf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w2xjv\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:53:01Z is after 2025-08-24T17:21:41Z" Jan 23 10:53:01 crc kubenswrapper[4957]: I0123 10:53:01.065092 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tlz2g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"233fdd78-4010-4fe8-9068-ee47d8ff25d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a41f7e81b1359b374160b43aed747c1058d4a086980d803825ae41e507f3d77c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6851e0ec1040550b8c9edb1b85213d2c849e381fae6b0f09c9a7247bd9c5088\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-23T10:52:37Z\\\",\\\"message\\\":\\\"2026-01-23T10:51:52+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_cc2190a4-eddd-4623-837e-d09cf8000fdd\\\\n2026-01-23T10:51:52+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_cc2190a4-eddd-4623-837e-d09cf8000fdd to /host/opt/cni/bin/\\\\n2026-01-23T10:51:52Z [verbose] multus-daemon started\\\\n2026-01-23T10:51:52Z [verbose] Readiness Indicator file check\\\\n2026-01-23T10:52:37Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpwrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tlz2g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:53:01Z is after 2025-08-24T17:21:41Z" Jan 23 10:53:01 crc kubenswrapper[4957]: I0123 10:53:01.092800 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87adc28a-89e3-4743-a9f2-098d4a9432d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:51:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a14cf2687aa7c7a4c43dffbc2ad99a41aef0e46719171f63c7f769ee2d54e4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca18567eec1b0cc34d911b28d9f3d670a061722086817f58236f6a0da557262\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f862c6f11fe904458a8ecde92079c0b4aa4a9cb4dfc6f2ca094a1d3142570d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",
\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0efa75cf10a812bc4de5b071048558eb5f48828f6fb3049f3820fe5e0b7e2b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1a90ac89ce8ac710e5f8cff26e69aff44a735ef8155a7e93324809904a33e01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26b4bdc4f2514902dc2c95df59af4c954a5c1905821f5981e9437ff54d6d544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-s
ocket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cbd0fe66fb090078f66ddc5174cf5273cbe2b54ca7beb8afcf6de97c848666e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cbd0fe66fb090078f66ddc5174cf5273cbe2b54ca7beb8afcf6de97c848666e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-23T10:52:52Z\\\",\\\"message\\\":\\\"Map:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.245:443: 10.217.5.245:9192:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {54fbe873-7e6d-475f-a0ad-8dd5f06d850d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0123 10:52:51.767445 7016 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-api/cluster-autoscaler-operator]} name:Service_openshift-machine-api/cluster-autoscaler-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.245:443: 10.217.5.245:9192:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {54fbe873-7e6d-475f-a0ad-8dd5f06d850d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0123 10:52:51.767543 7016 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0123 10:52:51.767599 7016 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-23T10:52:50Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-z8hcw_openshift-ovn-kubernetes(87adc28a-89e3-4743-a9f2-098d4a9432d8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1be5b459e3fae28da165ef0ee506ec5ccd39026d7b7e7c35a3f242c65d60d0ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3de3a4f166173fb3f0ff7428b1d1ddf43a9988749fc445ecff4a7527bd1f3a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3de3a4f166173fb3f0ff7428b1d1ddf43a9988749fc445ecff4a7527bd1f3a4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-23T10:51:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-23T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:51:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z8hcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:53:01Z is after 2025-08-24T17:21:41Z" Jan 23 10:53:01 crc kubenswrapper[4957]: I0123 10:53:01.112447 4957 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9rkq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"340bb9e5-0a20-4377-acf7-aba4b7788153\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-23T10:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac26363f220465a8681578f7a7b90cf7d0abf8676379a1e963f4998646327c95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:52:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55l78
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19e8778b10a2bb2a713374cb69e07a76edf2371f7a130a691e759a03c0322251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-23T10:52:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55l78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-23T10:52:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r9rkq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:53:01Z is after 2025-08-24T17:21:41Z" Jan 23 10:53:01 crc kubenswrapper[4957]: I0123 10:53:01.138575 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:01 crc kubenswrapper[4957]: I0123 10:53:01.138644 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:01 crc kubenswrapper[4957]: I0123 10:53:01.138662 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:01 crc kubenswrapper[4957]: I0123 10:53:01.138723 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:01 crc kubenswrapper[4957]: I0123 10:53:01.138742 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:01Z","lastTransitionTime":"2026-01-23T10:53:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:53:01 crc kubenswrapper[4957]: I0123 10:53:01.242255 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:01 crc kubenswrapper[4957]: I0123 10:53:01.242358 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:01 crc kubenswrapper[4957]: I0123 10:53:01.242380 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:01 crc kubenswrapper[4957]: I0123 10:53:01.242409 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:01 crc kubenswrapper[4957]: I0123 10:53:01.242427 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:01Z","lastTransitionTime":"2026-01-23T10:53:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:53:01 crc kubenswrapper[4957]: I0123 10:53:01.345730 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:01 crc kubenswrapper[4957]: I0123 10:53:01.345768 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:01 crc kubenswrapper[4957]: I0123 10:53:01.345777 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:01 crc kubenswrapper[4957]: I0123 10:53:01.345791 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:01 crc kubenswrapper[4957]: I0123 10:53:01.345799 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:01Z","lastTransitionTime":"2026-01-23T10:53:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:53:01 crc kubenswrapper[4957]: I0123 10:53:01.449857 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:01 crc kubenswrapper[4957]: I0123 10:53:01.450046 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:01 crc kubenswrapper[4957]: I0123 10:53:01.450088 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:01 crc kubenswrapper[4957]: I0123 10:53:01.450205 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:01 crc kubenswrapper[4957]: I0123 10:53:01.450346 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:01Z","lastTransitionTime":"2026-01-23T10:53:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:53:01 crc kubenswrapper[4957]: I0123 10:53:01.553807 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:01 crc kubenswrapper[4957]: I0123 10:53:01.553867 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:01 crc kubenswrapper[4957]: I0123 10:53:01.553876 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:01 crc kubenswrapper[4957]: I0123 10:53:01.553891 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:01 crc kubenswrapper[4957]: I0123 10:53:01.553901 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:01Z","lastTransitionTime":"2026-01-23T10:53:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:53:01 crc kubenswrapper[4957]: I0123 10:53:01.657575 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:01 crc kubenswrapper[4957]: I0123 10:53:01.657649 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:01 crc kubenswrapper[4957]: I0123 10:53:01.657674 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:01 crc kubenswrapper[4957]: I0123 10:53:01.657702 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:01 crc kubenswrapper[4957]: I0123 10:53:01.657725 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:01Z","lastTransitionTime":"2026-01-23T10:53:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:53:01 crc kubenswrapper[4957]: I0123 10:53:01.760168 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:01 crc kubenswrapper[4957]: I0123 10:53:01.760249 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:01 crc kubenswrapper[4957]: I0123 10:53:01.760266 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:01 crc kubenswrapper[4957]: I0123 10:53:01.760333 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:01 crc kubenswrapper[4957]: I0123 10:53:01.760353 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:01Z","lastTransitionTime":"2026-01-23T10:53:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:53:01 crc kubenswrapper[4957]: I0123 10:53:01.771092 4957 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-29 11:31:24.411461317 +0000 UTC Jan 23 10:53:01 crc kubenswrapper[4957]: I0123 10:53:01.862945 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:01 crc kubenswrapper[4957]: I0123 10:53:01.862992 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:01 crc kubenswrapper[4957]: I0123 10:53:01.863003 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:01 crc kubenswrapper[4957]: I0123 10:53:01.863019 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:01 crc kubenswrapper[4957]: I0123 10:53:01.863033 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:01Z","lastTransitionTime":"2026-01-23T10:53:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:53:01 crc kubenswrapper[4957]: I0123 10:53:01.965412 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:01 crc kubenswrapper[4957]: I0123 10:53:01.965444 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:01 crc kubenswrapper[4957]: I0123 10:53:01.965452 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:01 crc kubenswrapper[4957]: I0123 10:53:01.965467 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:01 crc kubenswrapper[4957]: I0123 10:53:01.965477 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:01Z","lastTransitionTime":"2026-01-23T10:53:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:53:02 crc kubenswrapper[4957]: I0123 10:53:02.068084 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:02 crc kubenswrapper[4957]: I0123 10:53:02.068132 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:02 crc kubenswrapper[4957]: I0123 10:53:02.068143 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:02 crc kubenswrapper[4957]: I0123 10:53:02.068161 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:02 crc kubenswrapper[4957]: I0123 10:53:02.068172 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:02Z","lastTransitionTime":"2026-01-23T10:53:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:53:02 crc kubenswrapper[4957]: I0123 10:53:02.171399 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:02 crc kubenswrapper[4957]: I0123 10:53:02.171444 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:02 crc kubenswrapper[4957]: I0123 10:53:02.171459 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:02 crc kubenswrapper[4957]: I0123 10:53:02.171477 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:02 crc kubenswrapper[4957]: I0123 10:53:02.171491 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:02Z","lastTransitionTime":"2026-01-23T10:53:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:53:02 crc kubenswrapper[4957]: I0123 10:53:02.274163 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:02 crc kubenswrapper[4957]: I0123 10:53:02.274205 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:02 crc kubenswrapper[4957]: I0123 10:53:02.274215 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:02 crc kubenswrapper[4957]: I0123 10:53:02.274231 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:02 crc kubenswrapper[4957]: I0123 10:53:02.274240 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:02Z","lastTransitionTime":"2026-01-23T10:53:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:53:02 crc kubenswrapper[4957]: I0123 10:53:02.377591 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:02 crc kubenswrapper[4957]: I0123 10:53:02.377639 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:02 crc kubenswrapper[4957]: I0123 10:53:02.377657 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:02 crc kubenswrapper[4957]: I0123 10:53:02.377675 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:02 crc kubenswrapper[4957]: I0123 10:53:02.377686 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:02Z","lastTransitionTime":"2026-01-23T10:53:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:53:02 crc kubenswrapper[4957]: I0123 10:53:02.480966 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:02 crc kubenswrapper[4957]: I0123 10:53:02.481018 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:02 crc kubenswrapper[4957]: I0123 10:53:02.481027 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:02 crc kubenswrapper[4957]: I0123 10:53:02.481041 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:02 crc kubenswrapper[4957]: I0123 10:53:02.481052 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:02Z","lastTransitionTime":"2026-01-23T10:53:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:53:02 crc kubenswrapper[4957]: I0123 10:53:02.584083 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:02 crc kubenswrapper[4957]: I0123 10:53:02.584133 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:02 crc kubenswrapper[4957]: I0123 10:53:02.584149 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:02 crc kubenswrapper[4957]: I0123 10:53:02.584168 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:02 crc kubenswrapper[4957]: I0123 10:53:02.584181 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:02Z","lastTransitionTime":"2026-01-23T10:53:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:53:02 crc kubenswrapper[4957]: I0123 10:53:02.686822 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:02 crc kubenswrapper[4957]: I0123 10:53:02.686886 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:02 crc kubenswrapper[4957]: I0123 10:53:02.686902 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:02 crc kubenswrapper[4957]: I0123 10:53:02.686931 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:02 crc kubenswrapper[4957]: I0123 10:53:02.686973 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:02Z","lastTransitionTime":"2026-01-23T10:53:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:53:02 crc kubenswrapper[4957]: I0123 10:53:02.769742 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 23 10:53:02 crc kubenswrapper[4957]: I0123 10:53:02.769871 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5fxfb" Jan 23 10:53:02 crc kubenswrapper[4957]: I0123 10:53:02.769783 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 23 10:53:02 crc kubenswrapper[4957]: E0123 10:53:02.769993 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 23 10:53:02 crc kubenswrapper[4957]: I0123 10:53:02.770025 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 23 10:53:02 crc kubenswrapper[4957]: E0123 10:53:02.770081 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 23 10:53:02 crc kubenswrapper[4957]: E0123 10:53:02.770209 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 23 10:53:02 crc kubenswrapper[4957]: E0123 10:53:02.770381 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5fxfb" podUID="87775b38-0664-48f6-8857-7568c135bd79" Jan 23 10:53:02 crc kubenswrapper[4957]: I0123 10:53:02.771697 4957 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-25 19:19:18.478106721 +0000 UTC Jan 23 10:53:02 crc kubenswrapper[4957]: I0123 10:53:02.789634 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:02 crc kubenswrapper[4957]: I0123 10:53:02.789690 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:02 crc kubenswrapper[4957]: I0123 10:53:02.789707 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:02 crc kubenswrapper[4957]: I0123 10:53:02.789728 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:02 crc kubenswrapper[4957]: I0123 10:53:02.789746 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:02Z","lastTransitionTime":"2026-01-23T10:53:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:53:02 crc kubenswrapper[4957]: I0123 10:53:02.892486 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:02 crc kubenswrapper[4957]: I0123 10:53:02.892530 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:02 crc kubenswrapper[4957]: I0123 10:53:02.892544 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:02 crc kubenswrapper[4957]: I0123 10:53:02.892564 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:02 crc kubenswrapper[4957]: I0123 10:53:02.892577 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:02Z","lastTransitionTime":"2026-01-23T10:53:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:53:02 crc kubenswrapper[4957]: I0123 10:53:02.998735 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:02 crc kubenswrapper[4957]: I0123 10:53:02.998852 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:02 crc kubenswrapper[4957]: I0123 10:53:02.998898 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:02 crc kubenswrapper[4957]: I0123 10:53:02.998928 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:02 crc kubenswrapper[4957]: I0123 10:53:02.998955 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:02Z","lastTransitionTime":"2026-01-23T10:53:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:53:03 crc kubenswrapper[4957]: I0123 10:53:03.101495 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:03 crc kubenswrapper[4957]: I0123 10:53:03.101566 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:03 crc kubenswrapper[4957]: I0123 10:53:03.101583 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:03 crc kubenswrapper[4957]: I0123 10:53:03.101608 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:03 crc kubenswrapper[4957]: I0123 10:53:03.101647 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:03Z","lastTransitionTime":"2026-01-23T10:53:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:53:03 crc kubenswrapper[4957]: I0123 10:53:03.204977 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:03 crc kubenswrapper[4957]: I0123 10:53:03.205053 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:03 crc kubenswrapper[4957]: I0123 10:53:03.205104 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:03 crc kubenswrapper[4957]: I0123 10:53:03.205130 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:03 crc kubenswrapper[4957]: I0123 10:53:03.205146 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:03Z","lastTransitionTime":"2026-01-23T10:53:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:53:03 crc kubenswrapper[4957]: I0123 10:53:03.307798 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:03 crc kubenswrapper[4957]: I0123 10:53:03.307849 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:03 crc kubenswrapper[4957]: I0123 10:53:03.307865 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:03 crc kubenswrapper[4957]: I0123 10:53:03.307892 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:03 crc kubenswrapper[4957]: I0123 10:53:03.307910 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:03Z","lastTransitionTime":"2026-01-23T10:53:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:53:03 crc kubenswrapper[4957]: I0123 10:53:03.410801 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:03 crc kubenswrapper[4957]: I0123 10:53:03.410857 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:03 crc kubenswrapper[4957]: I0123 10:53:03.410875 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:03 crc kubenswrapper[4957]: I0123 10:53:03.410899 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:03 crc kubenswrapper[4957]: I0123 10:53:03.410916 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:03Z","lastTransitionTime":"2026-01-23T10:53:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:53:03 crc kubenswrapper[4957]: I0123 10:53:03.514093 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:03 crc kubenswrapper[4957]: I0123 10:53:03.514148 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:03 crc kubenswrapper[4957]: I0123 10:53:03.514160 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:03 crc kubenswrapper[4957]: I0123 10:53:03.514179 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:03 crc kubenswrapper[4957]: I0123 10:53:03.514191 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:03Z","lastTransitionTime":"2026-01-23T10:53:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:53:03 crc kubenswrapper[4957]: I0123 10:53:03.618024 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:03 crc kubenswrapper[4957]: I0123 10:53:03.618070 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:03 crc kubenswrapper[4957]: I0123 10:53:03.618083 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:03 crc kubenswrapper[4957]: I0123 10:53:03.618101 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:03 crc kubenswrapper[4957]: I0123 10:53:03.618113 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:03Z","lastTransitionTime":"2026-01-23T10:53:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:53:03 crc kubenswrapper[4957]: I0123 10:53:03.721343 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:03 crc kubenswrapper[4957]: I0123 10:53:03.721524 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:03 crc kubenswrapper[4957]: I0123 10:53:03.721590 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:03 crc kubenswrapper[4957]: I0123 10:53:03.721658 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:03 crc kubenswrapper[4957]: I0123 10:53:03.721684 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:03Z","lastTransitionTime":"2026-01-23T10:53:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:53:03 crc kubenswrapper[4957]: I0123 10:53:03.772444 4957 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 21:30:51.931496012 +0000 UTC Jan 23 10:53:03 crc kubenswrapper[4957]: I0123 10:53:03.825383 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:03 crc kubenswrapper[4957]: I0123 10:53:03.825459 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:03 crc kubenswrapper[4957]: I0123 10:53:03.825481 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:03 crc kubenswrapper[4957]: I0123 10:53:03.825512 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:03 crc kubenswrapper[4957]: I0123 10:53:03.825533 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:03Z","lastTransitionTime":"2026-01-23T10:53:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:53:03 crc kubenswrapper[4957]: I0123 10:53:03.928778 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:03 crc kubenswrapper[4957]: I0123 10:53:03.928845 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:03 crc kubenswrapper[4957]: I0123 10:53:03.928863 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:03 crc kubenswrapper[4957]: I0123 10:53:03.928888 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:03 crc kubenswrapper[4957]: I0123 10:53:03.928906 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:03Z","lastTransitionTime":"2026-01-23T10:53:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:53:04 crc kubenswrapper[4957]: I0123 10:53:04.032560 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:04 crc kubenswrapper[4957]: I0123 10:53:04.032623 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:04 crc kubenswrapper[4957]: I0123 10:53:04.032641 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:04 crc kubenswrapper[4957]: I0123 10:53:04.032666 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:04 crc kubenswrapper[4957]: I0123 10:53:04.032684 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:04Z","lastTransitionTime":"2026-01-23T10:53:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:53:04 crc kubenswrapper[4957]: I0123 10:53:04.073715 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:04 crc kubenswrapper[4957]: I0123 10:53:04.073763 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:04 crc kubenswrapper[4957]: I0123 10:53:04.073772 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:04 crc kubenswrapper[4957]: I0123 10:53:04.073788 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:04 crc kubenswrapper[4957]: I0123 10:53:04.073803 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:04Z","lastTransitionTime":"2026-01-23T10:53:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:53:04 crc kubenswrapper[4957]: E0123 10:53:04.087444 4957 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:53:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:53:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:53:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:53:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:53:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:53:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:53:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:53:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"608e000a-3057-4f1e-b4ab-15bf3bfea3b8\\\",\\\"systemUUID\\\":\\\"4219e85c-09d5-42d3-a5cb-7c9fe3da136f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:53:04Z is after 2025-08-24T17:21:41Z" Jan 23 10:53:04 crc kubenswrapper[4957]: I0123 10:53:04.090988 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:04 crc kubenswrapper[4957]: I0123 10:53:04.091022 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 23 10:53:04 crc kubenswrapper[4957]: I0123 10:53:04.091032 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:04 crc kubenswrapper[4957]: I0123 10:53:04.091047 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:04 crc kubenswrapper[4957]: I0123 10:53:04.091058 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:04Z","lastTransitionTime":"2026-01-23T10:53:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:53:04 crc kubenswrapper[4957]: E0123 10:53:04.103841 4957 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:53:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:53:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:53:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:53:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:53:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:53:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:53:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:53:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"608e000a-3057-4f1e-b4ab-15bf3bfea3b8\\\",\\\"systemUUID\\\":\\\"4219e85c-09d5-42d3-a5cb-7c9fe3da136f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:53:04Z is after 2025-08-24T17:21:41Z" Jan 23 10:53:04 crc kubenswrapper[4957]: I0123 10:53:04.107151 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:04 crc kubenswrapper[4957]: I0123 10:53:04.107176 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 23 10:53:04 crc kubenswrapper[4957]: I0123 10:53:04.107184 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:04 crc kubenswrapper[4957]: I0123 10:53:04.107197 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:04 crc kubenswrapper[4957]: I0123 10:53:04.107206 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:04Z","lastTransitionTime":"2026-01-23T10:53:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:53:04 crc kubenswrapper[4957]: E0123 10:53:04.119730 4957 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:53:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:53:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:53:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:53:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:53:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:53:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:53:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:53:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"608e000a-3057-4f1e-b4ab-15bf3bfea3b8\\\",\\\"systemUUID\\\":\\\"4219e85c-09d5-42d3-a5cb-7c9fe3da136f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:53:04Z is after 2025-08-24T17:21:41Z" Jan 23 10:53:04 crc kubenswrapper[4957]: I0123 10:53:04.123369 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:04 crc kubenswrapper[4957]: I0123 10:53:04.123408 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 23 10:53:04 crc kubenswrapper[4957]: I0123 10:53:04.123419 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:04 crc kubenswrapper[4957]: I0123 10:53:04.123433 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:04 crc kubenswrapper[4957]: I0123 10:53:04.123442 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:04Z","lastTransitionTime":"2026-01-23T10:53:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:53:04 crc kubenswrapper[4957]: E0123 10:53:04.138315 4957 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:53:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:53:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:53:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:53:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:53:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:53:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:53:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:53:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"608e000a-3057-4f1e-b4ab-15bf3bfea3b8\\\",\\\"systemUUID\\\":\\\"4219e85c-09d5-42d3-a5cb-7c9fe3da136f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:53:04Z is after 2025-08-24T17:21:41Z" Jan 23 10:53:04 crc kubenswrapper[4957]: I0123 10:53:04.141801 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:04 crc kubenswrapper[4957]: I0123 10:53:04.141836 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 23 10:53:04 crc kubenswrapper[4957]: I0123 10:53:04.141845 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:04 crc kubenswrapper[4957]: I0123 10:53:04.141878 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:04 crc kubenswrapper[4957]: I0123 10:53:04.141895 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:04Z","lastTransitionTime":"2026-01-23T10:53:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:53:04 crc kubenswrapper[4957]: E0123 10:53:04.152669 4957 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:53:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:53:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:53:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:53:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:53:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:53:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T10:53:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-23T10:53:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"608e000a-3057-4f1e-b4ab-15bf3bfea3b8\\\",\\\"systemUUID\\\":\\\"4219e85c-09d5-42d3-a5cb-7c9fe3da136f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-23T10:53:04Z is after 2025-08-24T17:21:41Z" Jan 23 10:53:04 crc kubenswrapper[4957]: E0123 10:53:04.152805 4957 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 23 10:53:04 crc kubenswrapper[4957]: I0123 10:53:04.154202 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Jan 23 10:53:04 crc kubenswrapper[4957]: I0123 10:53:04.154261 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:04 crc kubenswrapper[4957]: I0123 10:53:04.154291 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:04 crc kubenswrapper[4957]: I0123 10:53:04.154316 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:04 crc kubenswrapper[4957]: I0123 10:53:04.154330 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:04Z","lastTransitionTime":"2026-01-23T10:53:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:53:04 crc kubenswrapper[4957]: I0123 10:53:04.256467 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:04 crc kubenswrapper[4957]: I0123 10:53:04.256523 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:04 crc kubenswrapper[4957]: I0123 10:53:04.256541 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:04 crc kubenswrapper[4957]: I0123 10:53:04.256566 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:04 crc kubenswrapper[4957]: I0123 10:53:04.256583 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:04Z","lastTransitionTime":"2026-01-23T10:53:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:53:04 crc kubenswrapper[4957]: I0123 10:53:04.359840 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:04 crc kubenswrapper[4957]: I0123 10:53:04.359907 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:04 crc kubenswrapper[4957]: I0123 10:53:04.359918 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:04 crc kubenswrapper[4957]: I0123 10:53:04.359945 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:04 crc kubenswrapper[4957]: I0123 10:53:04.359957 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:04Z","lastTransitionTime":"2026-01-23T10:53:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:53:04 crc kubenswrapper[4957]: I0123 10:53:04.462864 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:04 crc kubenswrapper[4957]: I0123 10:53:04.462912 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:04 crc kubenswrapper[4957]: I0123 10:53:04.462924 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:04 crc kubenswrapper[4957]: I0123 10:53:04.462942 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:04 crc kubenswrapper[4957]: I0123 10:53:04.462952 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:04Z","lastTransitionTime":"2026-01-23T10:53:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:53:04 crc kubenswrapper[4957]: I0123 10:53:04.566241 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:04 crc kubenswrapper[4957]: I0123 10:53:04.566331 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:04 crc kubenswrapper[4957]: I0123 10:53:04.566341 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:04 crc kubenswrapper[4957]: I0123 10:53:04.566356 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:04 crc kubenswrapper[4957]: I0123 10:53:04.566365 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:04Z","lastTransitionTime":"2026-01-23T10:53:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:53:04 crc kubenswrapper[4957]: I0123 10:53:04.668839 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:04 crc kubenswrapper[4957]: I0123 10:53:04.668900 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:04 crc kubenswrapper[4957]: I0123 10:53:04.668914 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:04 crc kubenswrapper[4957]: I0123 10:53:04.668931 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:04 crc kubenswrapper[4957]: I0123 10:53:04.668941 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:04Z","lastTransitionTime":"2026-01-23T10:53:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:53:04 crc kubenswrapper[4957]: I0123 10:53:04.769160 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5fxfb" Jan 23 10:53:04 crc kubenswrapper[4957]: E0123 10:53:04.769372 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5fxfb" podUID="87775b38-0664-48f6-8857-7568c135bd79" Jan 23 10:53:04 crc kubenswrapper[4957]: I0123 10:53:04.769577 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 23 10:53:04 crc kubenswrapper[4957]: I0123 10:53:04.769914 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 23 10:53:04 crc kubenswrapper[4957]: I0123 10:53:04.770393 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 23 10:53:04 crc kubenswrapper[4957]: E0123 10:53:04.770578 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 23 10:53:04 crc kubenswrapper[4957]: E0123 10:53:04.770710 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 23 10:53:04 crc kubenswrapper[4957]: E0123 10:53:04.770726 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 23 10:53:04 crc kubenswrapper[4957]: I0123 10:53:04.771016 4957 scope.go:117] "RemoveContainer" containerID="6cbd0fe66fb090078f66ddc5174cf5273cbe2b54ca7beb8afcf6de97c848666e" Jan 23 10:53:04 crc kubenswrapper[4957]: E0123 10:53:04.771238 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-z8hcw_openshift-ovn-kubernetes(87adc28a-89e3-4743-a9f2-098d4a9432d8)\"" pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" podUID="87adc28a-89e3-4743-a9f2-098d4a9432d8" Jan 23 10:53:04 crc kubenswrapper[4957]: I0123 10:53:04.771312 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:04 crc kubenswrapper[4957]: I0123 10:53:04.771376 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:04 crc kubenswrapper[4957]: I0123 10:53:04.771394 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:04 crc kubenswrapper[4957]: I0123 10:53:04.771420 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:04 crc kubenswrapper[4957]: I0123 10:53:04.771438 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:04Z","lastTransitionTime":"2026-01-23T10:53:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:53:04 crc kubenswrapper[4957]: I0123 10:53:04.772560 4957 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-22 20:47:58.9043243 +0000 UTC Jan 23 10:53:04 crc kubenswrapper[4957]: I0123 10:53:04.874054 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:04 crc kubenswrapper[4957]: I0123 10:53:04.875050 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:04 crc kubenswrapper[4957]: I0123 10:53:04.875230 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:04 crc kubenswrapper[4957]: I0123 10:53:04.875610 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:04 crc kubenswrapper[4957]: I0123 10:53:04.875808 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:04Z","lastTransitionTime":"2026-01-23T10:53:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:53:04 crc kubenswrapper[4957]: I0123 10:53:04.979199 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:04 crc kubenswrapper[4957]: I0123 10:53:04.979260 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:04 crc kubenswrapper[4957]: I0123 10:53:04.979302 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:04 crc kubenswrapper[4957]: I0123 10:53:04.979328 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:04 crc kubenswrapper[4957]: I0123 10:53:04.979345 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:04Z","lastTransitionTime":"2026-01-23T10:53:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:53:05 crc kubenswrapper[4957]: I0123 10:53:05.082540 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:05 crc kubenswrapper[4957]: I0123 10:53:05.082613 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:05 crc kubenswrapper[4957]: I0123 10:53:05.082634 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:05 crc kubenswrapper[4957]: I0123 10:53:05.082660 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:05 crc kubenswrapper[4957]: I0123 10:53:05.082678 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:05Z","lastTransitionTime":"2026-01-23T10:53:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:53:05 crc kubenswrapper[4957]: I0123 10:53:05.185710 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:05 crc kubenswrapper[4957]: I0123 10:53:05.185810 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:05 crc kubenswrapper[4957]: I0123 10:53:05.185848 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:05 crc kubenswrapper[4957]: I0123 10:53:05.185880 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:05 crc kubenswrapper[4957]: I0123 10:53:05.185902 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:05Z","lastTransitionTime":"2026-01-23T10:53:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:53:05 crc kubenswrapper[4957]: I0123 10:53:05.289644 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:05 crc kubenswrapper[4957]: I0123 10:53:05.289721 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:05 crc kubenswrapper[4957]: I0123 10:53:05.289759 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:05 crc kubenswrapper[4957]: I0123 10:53:05.289792 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:05 crc kubenswrapper[4957]: I0123 10:53:05.289815 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:05Z","lastTransitionTime":"2026-01-23T10:53:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:53:05 crc kubenswrapper[4957]: I0123 10:53:05.392730 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:05 crc kubenswrapper[4957]: I0123 10:53:05.392785 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:05 crc kubenswrapper[4957]: I0123 10:53:05.392802 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:05 crc kubenswrapper[4957]: I0123 10:53:05.392827 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:05 crc kubenswrapper[4957]: I0123 10:53:05.392845 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:05Z","lastTransitionTime":"2026-01-23T10:53:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:53:05 crc kubenswrapper[4957]: I0123 10:53:05.495809 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:05 crc kubenswrapper[4957]: I0123 10:53:05.495883 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:05 crc kubenswrapper[4957]: I0123 10:53:05.495906 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:05 crc kubenswrapper[4957]: I0123 10:53:05.495935 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:05 crc kubenswrapper[4957]: I0123 10:53:05.495970 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:05Z","lastTransitionTime":"2026-01-23T10:53:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:53:05 crc kubenswrapper[4957]: I0123 10:53:05.599773 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:05 crc kubenswrapper[4957]: I0123 10:53:05.599853 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:05 crc kubenswrapper[4957]: I0123 10:53:05.599876 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:05 crc kubenswrapper[4957]: I0123 10:53:05.599911 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:05 crc kubenswrapper[4957]: I0123 10:53:05.599929 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:05Z","lastTransitionTime":"2026-01-23T10:53:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:53:05 crc kubenswrapper[4957]: I0123 10:53:05.703430 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:05 crc kubenswrapper[4957]: I0123 10:53:05.703493 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:05 crc kubenswrapper[4957]: I0123 10:53:05.703509 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:05 crc kubenswrapper[4957]: I0123 10:53:05.703532 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:05 crc kubenswrapper[4957]: I0123 10:53:05.703552 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:05Z","lastTransitionTime":"2026-01-23T10:53:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:53:05 crc kubenswrapper[4957]: I0123 10:53:05.773387 4957 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-01 00:49:23.358819541 +0000 UTC Jan 23 10:53:05 crc kubenswrapper[4957]: I0123 10:53:05.805370 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:05 crc kubenswrapper[4957]: I0123 10:53:05.805407 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:05 crc kubenswrapper[4957]: I0123 10:53:05.805418 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:05 crc kubenswrapper[4957]: I0123 10:53:05.805433 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:05 crc kubenswrapper[4957]: I0123 10:53:05.805443 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:05Z","lastTransitionTime":"2026-01-23T10:53:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:53:05 crc kubenswrapper[4957]: I0123 10:53:05.908942 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:05 crc kubenswrapper[4957]: I0123 10:53:05.909023 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:05 crc kubenswrapper[4957]: I0123 10:53:05.909048 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:05 crc kubenswrapper[4957]: I0123 10:53:05.909079 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:05 crc kubenswrapper[4957]: I0123 10:53:05.909101 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:05Z","lastTransitionTime":"2026-01-23T10:53:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:53:06 crc kubenswrapper[4957]: I0123 10:53:06.011667 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:06 crc kubenswrapper[4957]: I0123 10:53:06.011733 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:06 crc kubenswrapper[4957]: I0123 10:53:06.011749 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:06 crc kubenswrapper[4957]: I0123 10:53:06.011772 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:06 crc kubenswrapper[4957]: I0123 10:53:06.011789 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:06Z","lastTransitionTime":"2026-01-23T10:53:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:53:06 crc kubenswrapper[4957]: I0123 10:53:06.115142 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:06 crc kubenswrapper[4957]: I0123 10:53:06.115182 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:06 crc kubenswrapper[4957]: I0123 10:53:06.115191 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:06 crc kubenswrapper[4957]: I0123 10:53:06.115209 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:06 crc kubenswrapper[4957]: I0123 10:53:06.115219 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:06Z","lastTransitionTime":"2026-01-23T10:53:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:53:06 crc kubenswrapper[4957]: I0123 10:53:06.218785 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:06 crc kubenswrapper[4957]: I0123 10:53:06.218849 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:06 crc kubenswrapper[4957]: I0123 10:53:06.218874 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:06 crc kubenswrapper[4957]: I0123 10:53:06.218904 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:06 crc kubenswrapper[4957]: I0123 10:53:06.218925 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:06Z","lastTransitionTime":"2026-01-23T10:53:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:53:06 crc kubenswrapper[4957]: I0123 10:53:06.321355 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:06 crc kubenswrapper[4957]: I0123 10:53:06.321418 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:06 crc kubenswrapper[4957]: I0123 10:53:06.321440 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:06 crc kubenswrapper[4957]: I0123 10:53:06.321468 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:06 crc kubenswrapper[4957]: I0123 10:53:06.321487 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:06Z","lastTransitionTime":"2026-01-23T10:53:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:53:06 crc kubenswrapper[4957]: I0123 10:53:06.424610 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:06 crc kubenswrapper[4957]: I0123 10:53:06.424650 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:06 crc kubenswrapper[4957]: I0123 10:53:06.424658 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:06 crc kubenswrapper[4957]: I0123 10:53:06.424674 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:06 crc kubenswrapper[4957]: I0123 10:53:06.424684 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:06Z","lastTransitionTime":"2026-01-23T10:53:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:53:06 crc kubenswrapper[4957]: I0123 10:53:06.529053 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:06 crc kubenswrapper[4957]: I0123 10:53:06.529095 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:06 crc kubenswrapper[4957]: I0123 10:53:06.529105 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:06 crc kubenswrapper[4957]: I0123 10:53:06.529123 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:06 crc kubenswrapper[4957]: I0123 10:53:06.529135 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:06Z","lastTransitionTime":"2026-01-23T10:53:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:53:06 crc kubenswrapper[4957]: I0123 10:53:06.632418 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:06 crc kubenswrapper[4957]: I0123 10:53:06.632477 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:06 crc kubenswrapper[4957]: I0123 10:53:06.632488 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:06 crc kubenswrapper[4957]: I0123 10:53:06.632505 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:06 crc kubenswrapper[4957]: I0123 10:53:06.632516 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:06Z","lastTransitionTime":"2026-01-23T10:53:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:53:06 crc kubenswrapper[4957]: I0123 10:53:06.735355 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:06 crc kubenswrapper[4957]: I0123 10:53:06.735439 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:06 crc kubenswrapper[4957]: I0123 10:53:06.735459 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:06 crc kubenswrapper[4957]: I0123 10:53:06.735489 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:06 crc kubenswrapper[4957]: I0123 10:53:06.735508 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:06Z","lastTransitionTime":"2026-01-23T10:53:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:53:06 crc kubenswrapper[4957]: I0123 10:53:06.769100 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5fxfb" Jan 23 10:53:06 crc kubenswrapper[4957]: I0123 10:53:06.769135 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 23 10:53:06 crc kubenswrapper[4957]: I0123 10:53:06.769224 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 23 10:53:06 crc kubenswrapper[4957]: E0123 10:53:06.769233 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5fxfb" podUID="87775b38-0664-48f6-8857-7568c135bd79" Jan 23 10:53:06 crc kubenswrapper[4957]: I0123 10:53:06.769406 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 23 10:53:06 crc kubenswrapper[4957]: E0123 10:53:06.769532 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 23 10:53:06 crc kubenswrapper[4957]: E0123 10:53:06.769872 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 23 10:53:06 crc kubenswrapper[4957]: E0123 10:53:06.770029 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 23 10:53:06 crc kubenswrapper[4957]: I0123 10:53:06.773542 4957 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-18 17:31:43.216832109 +0000 UTC Jan 23 10:53:06 crc kubenswrapper[4957]: I0123 10:53:06.838024 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:06 crc kubenswrapper[4957]: I0123 10:53:06.838097 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:06 crc kubenswrapper[4957]: I0123 10:53:06.838114 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:06 crc kubenswrapper[4957]: I0123 10:53:06.838138 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:06 crc kubenswrapper[4957]: I0123 10:53:06.838156 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:06Z","lastTransitionTime":"2026-01-23T10:53:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:53:06 crc kubenswrapper[4957]: I0123 10:53:06.940531 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:06 crc kubenswrapper[4957]: I0123 10:53:06.940596 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:06 crc kubenswrapper[4957]: I0123 10:53:06.940615 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:06 crc kubenswrapper[4957]: I0123 10:53:06.940641 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:06 crc kubenswrapper[4957]: I0123 10:53:06.940659 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:06Z","lastTransitionTime":"2026-01-23T10:53:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:53:07 crc kubenswrapper[4957]: I0123 10:53:07.044812 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:07 crc kubenswrapper[4957]: I0123 10:53:07.044874 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:07 crc kubenswrapper[4957]: I0123 10:53:07.044891 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:07 crc kubenswrapper[4957]: I0123 10:53:07.044918 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:07 crc kubenswrapper[4957]: I0123 10:53:07.044938 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:07Z","lastTransitionTime":"2026-01-23T10:53:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:53:07 crc kubenswrapper[4957]: I0123 10:53:07.148954 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:07 crc kubenswrapper[4957]: I0123 10:53:07.149014 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:07 crc kubenswrapper[4957]: I0123 10:53:07.149032 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:07 crc kubenswrapper[4957]: I0123 10:53:07.149056 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:07 crc kubenswrapper[4957]: I0123 10:53:07.149074 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:07Z","lastTransitionTime":"2026-01-23T10:53:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:53:07 crc kubenswrapper[4957]: I0123 10:53:07.252028 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:07 crc kubenswrapper[4957]: I0123 10:53:07.252064 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:07 crc kubenswrapper[4957]: I0123 10:53:07.252072 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:07 crc kubenswrapper[4957]: I0123 10:53:07.252086 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:07 crc kubenswrapper[4957]: I0123 10:53:07.252096 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:07Z","lastTransitionTime":"2026-01-23T10:53:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:53:07 crc kubenswrapper[4957]: I0123 10:53:07.354098 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:07 crc kubenswrapper[4957]: I0123 10:53:07.354138 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:07 crc kubenswrapper[4957]: I0123 10:53:07.354150 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:07 crc kubenswrapper[4957]: I0123 10:53:07.354167 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:07 crc kubenswrapper[4957]: I0123 10:53:07.354178 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:07Z","lastTransitionTime":"2026-01-23T10:53:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:53:07 crc kubenswrapper[4957]: I0123 10:53:07.456908 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:07 crc kubenswrapper[4957]: I0123 10:53:07.457008 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:07 crc kubenswrapper[4957]: I0123 10:53:07.457025 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:07 crc kubenswrapper[4957]: I0123 10:53:07.457050 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:07 crc kubenswrapper[4957]: I0123 10:53:07.457064 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:07Z","lastTransitionTime":"2026-01-23T10:53:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:53:07 crc kubenswrapper[4957]: I0123 10:53:07.559895 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:07 crc kubenswrapper[4957]: I0123 10:53:07.559965 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:07 crc kubenswrapper[4957]: I0123 10:53:07.559984 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:07 crc kubenswrapper[4957]: I0123 10:53:07.560010 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:07 crc kubenswrapper[4957]: I0123 10:53:07.560027 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:07Z","lastTransitionTime":"2026-01-23T10:53:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:53:07 crc kubenswrapper[4957]: I0123 10:53:07.662880 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:07 crc kubenswrapper[4957]: I0123 10:53:07.663642 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:07 crc kubenswrapper[4957]: I0123 10:53:07.663903 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:07 crc kubenswrapper[4957]: I0123 10:53:07.664075 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:07 crc kubenswrapper[4957]: I0123 10:53:07.664238 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:07Z","lastTransitionTime":"2026-01-23T10:53:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:53:07 crc kubenswrapper[4957]: I0123 10:53:07.767241 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:07 crc kubenswrapper[4957]: I0123 10:53:07.767269 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:07 crc kubenswrapper[4957]: I0123 10:53:07.767292 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:07 crc kubenswrapper[4957]: I0123 10:53:07.767304 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:07 crc kubenswrapper[4957]: I0123 10:53:07.767313 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:07Z","lastTransitionTime":"2026-01-23T10:53:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:53:07 crc kubenswrapper[4957]: I0123 10:53:07.774094 4957 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-25 05:55:50.851600567 +0000 UTC Jan 23 10:53:07 crc kubenswrapper[4957]: I0123 10:53:07.870479 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:07 crc kubenswrapper[4957]: I0123 10:53:07.870524 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:07 crc kubenswrapper[4957]: I0123 10:53:07.870536 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:07 crc kubenswrapper[4957]: I0123 10:53:07.870553 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:07 crc kubenswrapper[4957]: I0123 10:53:07.870568 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:07Z","lastTransitionTime":"2026-01-23T10:53:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:53:07 crc kubenswrapper[4957]: I0123 10:53:07.972920 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:07 crc kubenswrapper[4957]: I0123 10:53:07.972993 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:07 crc kubenswrapper[4957]: I0123 10:53:07.973020 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:07 crc kubenswrapper[4957]: I0123 10:53:07.973051 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:07 crc kubenswrapper[4957]: I0123 10:53:07.973073 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:07Z","lastTransitionTime":"2026-01-23T10:53:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:53:08 crc kubenswrapper[4957]: I0123 10:53:08.075813 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:08 crc kubenswrapper[4957]: I0123 10:53:08.075875 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:08 crc kubenswrapper[4957]: I0123 10:53:08.075901 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:08 crc kubenswrapper[4957]: I0123 10:53:08.075930 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:08 crc kubenswrapper[4957]: I0123 10:53:08.075953 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:08Z","lastTransitionTime":"2026-01-23T10:53:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:53:08 crc kubenswrapper[4957]: I0123 10:53:08.178612 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:08 crc kubenswrapper[4957]: I0123 10:53:08.178672 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:08 crc kubenswrapper[4957]: I0123 10:53:08.178690 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:08 crc kubenswrapper[4957]: I0123 10:53:08.178719 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:08 crc kubenswrapper[4957]: I0123 10:53:08.178736 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:08Z","lastTransitionTime":"2026-01-23T10:53:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:53:08 crc kubenswrapper[4957]: I0123 10:53:08.281027 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:08 crc kubenswrapper[4957]: I0123 10:53:08.281102 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:08 crc kubenswrapper[4957]: I0123 10:53:08.281125 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:08 crc kubenswrapper[4957]: I0123 10:53:08.281150 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:08 crc kubenswrapper[4957]: I0123 10:53:08.281169 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:08Z","lastTransitionTime":"2026-01-23T10:53:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:53:08 crc kubenswrapper[4957]: I0123 10:53:08.384413 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:08 crc kubenswrapper[4957]: I0123 10:53:08.384601 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:08 crc kubenswrapper[4957]: I0123 10:53:08.384630 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:08 crc kubenswrapper[4957]: I0123 10:53:08.384658 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:08 crc kubenswrapper[4957]: I0123 10:53:08.384679 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:08Z","lastTransitionTime":"2026-01-23T10:53:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:53:08 crc kubenswrapper[4957]: I0123 10:53:08.465239 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/87775b38-0664-48f6-8857-7568c135bd79-metrics-certs\") pod \"network-metrics-daemon-5fxfb\" (UID: \"87775b38-0664-48f6-8857-7568c135bd79\") " pod="openshift-multus/network-metrics-daemon-5fxfb" Jan 23 10:53:08 crc kubenswrapper[4957]: E0123 10:53:08.465431 4957 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 23 10:53:08 crc kubenswrapper[4957]: E0123 10:53:08.465501 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/87775b38-0664-48f6-8857-7568c135bd79-metrics-certs podName:87775b38-0664-48f6-8857-7568c135bd79 nodeName:}" failed. No retries permitted until 2026-01-23 10:54:12.465482574 +0000 UTC m=+162.002735271 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/87775b38-0664-48f6-8857-7568c135bd79-metrics-certs") pod "network-metrics-daemon-5fxfb" (UID: "87775b38-0664-48f6-8857-7568c135bd79") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 23 10:53:08 crc kubenswrapper[4957]: I0123 10:53:08.486987 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:08 crc kubenswrapper[4957]: I0123 10:53:08.487036 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:08 crc kubenswrapper[4957]: I0123 10:53:08.487051 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:08 crc kubenswrapper[4957]: I0123 10:53:08.487071 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:08 crc kubenswrapper[4957]: I0123 10:53:08.487084 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:08Z","lastTransitionTime":"2026-01-23T10:53:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:53:08 crc kubenswrapper[4957]: I0123 10:53:08.590326 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:08 crc kubenswrapper[4957]: I0123 10:53:08.590389 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:08 crc kubenswrapper[4957]: I0123 10:53:08.590411 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:08 crc kubenswrapper[4957]: I0123 10:53:08.590440 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:08 crc kubenswrapper[4957]: I0123 10:53:08.590461 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:08Z","lastTransitionTime":"2026-01-23T10:53:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:53:08 crc kubenswrapper[4957]: I0123 10:53:08.692854 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:08 crc kubenswrapper[4957]: I0123 10:53:08.692929 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:08 crc kubenswrapper[4957]: I0123 10:53:08.692963 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:08 crc kubenswrapper[4957]: I0123 10:53:08.692996 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:08 crc kubenswrapper[4957]: I0123 10:53:08.693018 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:08Z","lastTransitionTime":"2026-01-23T10:53:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:53:08 crc kubenswrapper[4957]: I0123 10:53:08.768978 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 23 10:53:08 crc kubenswrapper[4957]: I0123 10:53:08.769050 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 23 10:53:08 crc kubenswrapper[4957]: I0123 10:53:08.768987 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 23 10:53:08 crc kubenswrapper[4957]: E0123 10:53:08.769173 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 23 10:53:08 crc kubenswrapper[4957]: I0123 10:53:08.769597 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5fxfb" Jan 23 10:53:08 crc kubenswrapper[4957]: E0123 10:53:08.769732 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 23 10:53:08 crc kubenswrapper[4957]: E0123 10:53:08.769837 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5fxfb" podUID="87775b38-0664-48f6-8857-7568c135bd79" Jan 23 10:53:08 crc kubenswrapper[4957]: E0123 10:53:08.769957 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 23 10:53:08 crc kubenswrapper[4957]: I0123 10:53:08.775178 4957 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-11 05:16:33.273938245 +0000 UTC Jan 23 10:53:08 crc kubenswrapper[4957]: I0123 10:53:08.796413 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:08 crc kubenswrapper[4957]: I0123 10:53:08.796641 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:08 crc kubenswrapper[4957]: I0123 10:53:08.796797 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:08 crc kubenswrapper[4957]: I0123 10:53:08.796941 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:08 crc kubenswrapper[4957]: I0123 10:53:08.797140 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:08Z","lastTransitionTime":"2026-01-23T10:53:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:53:08 crc kubenswrapper[4957]: I0123 10:53:08.899895 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:08 crc kubenswrapper[4957]: I0123 10:53:08.899929 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:08 crc kubenswrapper[4957]: I0123 10:53:08.899937 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:08 crc kubenswrapper[4957]: I0123 10:53:08.899950 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:08 crc kubenswrapper[4957]: I0123 10:53:08.899958 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:08Z","lastTransitionTime":"2026-01-23T10:53:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:53:09 crc kubenswrapper[4957]: I0123 10:53:09.002323 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:09 crc kubenswrapper[4957]: I0123 10:53:09.002379 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:09 crc kubenswrapper[4957]: I0123 10:53:09.002396 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:09 crc kubenswrapper[4957]: I0123 10:53:09.002416 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:09 crc kubenswrapper[4957]: I0123 10:53:09.002429 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:09Z","lastTransitionTime":"2026-01-23T10:53:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:53:09 crc kubenswrapper[4957]: I0123 10:53:09.104625 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:09 crc kubenswrapper[4957]: I0123 10:53:09.104673 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:09 crc kubenswrapper[4957]: I0123 10:53:09.104686 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:09 crc kubenswrapper[4957]: I0123 10:53:09.104705 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:09 crc kubenswrapper[4957]: I0123 10:53:09.104719 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:09Z","lastTransitionTime":"2026-01-23T10:53:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:53:09 crc kubenswrapper[4957]: I0123 10:53:09.207043 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:09 crc kubenswrapper[4957]: I0123 10:53:09.207354 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:09 crc kubenswrapper[4957]: I0123 10:53:09.207557 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:09 crc kubenswrapper[4957]: I0123 10:53:09.207678 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:09 crc kubenswrapper[4957]: I0123 10:53:09.207802 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:09Z","lastTransitionTime":"2026-01-23T10:53:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:53:09 crc kubenswrapper[4957]: I0123 10:53:09.311775 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:09 crc kubenswrapper[4957]: I0123 10:53:09.312079 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:09 crc kubenswrapper[4957]: I0123 10:53:09.312173 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:09 crc kubenswrapper[4957]: I0123 10:53:09.312241 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:09 crc kubenswrapper[4957]: I0123 10:53:09.312323 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:09Z","lastTransitionTime":"2026-01-23T10:53:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:53:09 crc kubenswrapper[4957]: I0123 10:53:09.415315 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:09 crc kubenswrapper[4957]: I0123 10:53:09.415644 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:09 crc kubenswrapper[4957]: I0123 10:53:09.415761 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:09 crc kubenswrapper[4957]: I0123 10:53:09.415874 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:09 crc kubenswrapper[4957]: I0123 10:53:09.415985 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:09Z","lastTransitionTime":"2026-01-23T10:53:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:53:09 crc kubenswrapper[4957]: I0123 10:53:09.518830 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:09 crc kubenswrapper[4957]: I0123 10:53:09.519105 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:09 crc kubenswrapper[4957]: I0123 10:53:09.519372 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:09 crc kubenswrapper[4957]: I0123 10:53:09.519594 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:09 crc kubenswrapper[4957]: I0123 10:53:09.519798 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:09Z","lastTransitionTime":"2026-01-23T10:53:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:53:09 crc kubenswrapper[4957]: I0123 10:53:09.623218 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:09 crc kubenswrapper[4957]: I0123 10:53:09.623606 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:09 crc kubenswrapper[4957]: I0123 10:53:09.623772 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:09 crc kubenswrapper[4957]: I0123 10:53:09.623947 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:09 crc kubenswrapper[4957]: I0123 10:53:09.624082 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:09Z","lastTransitionTime":"2026-01-23T10:53:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:53:09 crc kubenswrapper[4957]: I0123 10:53:09.726654 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:09 crc kubenswrapper[4957]: I0123 10:53:09.726974 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:09 crc kubenswrapper[4957]: I0123 10:53:09.727115 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:09 crc kubenswrapper[4957]: I0123 10:53:09.727253 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:09 crc kubenswrapper[4957]: I0123 10:53:09.727436 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:09Z","lastTransitionTime":"2026-01-23T10:53:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:53:09 crc kubenswrapper[4957]: I0123 10:53:09.776167 4957 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-16 08:59:31.24271137 +0000 UTC Jan 23 10:53:09 crc kubenswrapper[4957]: I0123 10:53:09.831243 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:09 crc kubenswrapper[4957]: I0123 10:53:09.831375 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:09 crc kubenswrapper[4957]: I0123 10:53:09.831396 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:09 crc kubenswrapper[4957]: I0123 10:53:09.831427 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:09 crc kubenswrapper[4957]: I0123 10:53:09.831446 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:09Z","lastTransitionTime":"2026-01-23T10:53:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:53:09 crc kubenswrapper[4957]: I0123 10:53:09.934193 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:09 crc kubenswrapper[4957]: I0123 10:53:09.934245 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:09 crc kubenswrapper[4957]: I0123 10:53:09.934264 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:09 crc kubenswrapper[4957]: I0123 10:53:09.934325 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:09 crc kubenswrapper[4957]: I0123 10:53:09.934344 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:09Z","lastTransitionTime":"2026-01-23T10:53:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:53:10 crc kubenswrapper[4957]: I0123 10:53:10.037919 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:10 crc kubenswrapper[4957]: I0123 10:53:10.037977 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:10 crc kubenswrapper[4957]: I0123 10:53:10.037994 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:10 crc kubenswrapper[4957]: I0123 10:53:10.038018 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:10 crc kubenswrapper[4957]: I0123 10:53:10.038036 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:10Z","lastTransitionTime":"2026-01-23T10:53:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:53:10 crc kubenswrapper[4957]: I0123 10:53:10.142334 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:10 crc kubenswrapper[4957]: I0123 10:53:10.142408 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:10 crc kubenswrapper[4957]: I0123 10:53:10.142426 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:10 crc kubenswrapper[4957]: I0123 10:53:10.142449 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:10 crc kubenswrapper[4957]: I0123 10:53:10.142467 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:10Z","lastTransitionTime":"2026-01-23T10:53:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:53:10 crc kubenswrapper[4957]: I0123 10:53:10.246122 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:10 crc kubenswrapper[4957]: I0123 10:53:10.246174 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:10 crc kubenswrapper[4957]: I0123 10:53:10.246191 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:10 crc kubenswrapper[4957]: I0123 10:53:10.246215 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:10 crc kubenswrapper[4957]: I0123 10:53:10.246232 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:10Z","lastTransitionTime":"2026-01-23T10:53:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:53:10 crc kubenswrapper[4957]: I0123 10:53:10.349225 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:10 crc kubenswrapper[4957]: I0123 10:53:10.349333 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:10 crc kubenswrapper[4957]: I0123 10:53:10.349361 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:10 crc kubenswrapper[4957]: I0123 10:53:10.349392 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:10 crc kubenswrapper[4957]: I0123 10:53:10.349417 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:10Z","lastTransitionTime":"2026-01-23T10:53:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:53:10 crc kubenswrapper[4957]: I0123 10:53:10.452653 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:10 crc kubenswrapper[4957]: I0123 10:53:10.452702 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:10 crc kubenswrapper[4957]: I0123 10:53:10.452719 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:10 crc kubenswrapper[4957]: I0123 10:53:10.452746 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:10 crc kubenswrapper[4957]: I0123 10:53:10.452765 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:10Z","lastTransitionTime":"2026-01-23T10:53:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:53:10 crc kubenswrapper[4957]: I0123 10:53:10.555649 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:10 crc kubenswrapper[4957]: I0123 10:53:10.555706 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:10 crc kubenswrapper[4957]: I0123 10:53:10.555728 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:10 crc kubenswrapper[4957]: I0123 10:53:10.555756 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:10 crc kubenswrapper[4957]: I0123 10:53:10.555773 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:10Z","lastTransitionTime":"2026-01-23T10:53:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:53:10 crc kubenswrapper[4957]: I0123 10:53:10.659374 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:10 crc kubenswrapper[4957]: I0123 10:53:10.659464 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:10 crc kubenswrapper[4957]: I0123 10:53:10.659481 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:10 crc kubenswrapper[4957]: I0123 10:53:10.659506 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:10 crc kubenswrapper[4957]: I0123 10:53:10.659523 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:10Z","lastTransitionTime":"2026-01-23T10:53:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:53:10 crc kubenswrapper[4957]: I0123 10:53:10.762542 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:10 crc kubenswrapper[4957]: I0123 10:53:10.762583 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:10 crc kubenswrapper[4957]: I0123 10:53:10.762598 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:10 crc kubenswrapper[4957]: I0123 10:53:10.762617 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:10 crc kubenswrapper[4957]: I0123 10:53:10.762630 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:10Z","lastTransitionTime":"2026-01-23T10:53:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:53:10 crc kubenswrapper[4957]: I0123 10:53:10.769029 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 23 10:53:10 crc kubenswrapper[4957]: I0123 10:53:10.769161 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 23 10:53:10 crc kubenswrapper[4957]: E0123 10:53:10.769325 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 23 10:53:10 crc kubenswrapper[4957]: I0123 10:53:10.769443 4957 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 23 10:53:10 crc kubenswrapper[4957]: I0123 10:53:10.769439 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5fxfb" Jan 23 10:53:10 crc kubenswrapper[4957]: E0123 10:53:10.769605 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 23 10:53:10 crc kubenswrapper[4957]: E0123 10:53:10.769782 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 23 10:53:10 crc kubenswrapper[4957]: E0123 10:53:10.770001 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5fxfb" podUID="87775b38-0664-48f6-8857-7568c135bd79" Jan 23 10:53:10 crc kubenswrapper[4957]: I0123 10:53:10.776460 4957 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 07:27:52.976683987 +0000 UTC Jan 23 10:53:10 crc kubenswrapper[4957]: I0123 10:53:10.823744 4957 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=81.823719519 podStartE2EDuration="1m21.823719519s" podCreationTimestamp="2026-01-23 10:51:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-23 10:53:10.822778324 +0000 UTC m=+100.360031051" watchObservedRunningTime="2026-01-23 10:53:10.823719519 +0000 UTC m=+100.360972236" Jan 23 10:53:10 crc kubenswrapper[4957]: I0123 10:53:10.865142 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:10 crc kubenswrapper[4957]: I0123 10:53:10.865204 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:10 crc kubenswrapper[4957]: I0123 10:53:10.865223 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:10 crc kubenswrapper[4957]: I0123 10:53:10.865249 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:10 crc kubenswrapper[4957]: I0123 10:53:10.865302 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:10Z","lastTransitionTime":"2026-01-23T10:53:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
Jan 23 10:53:10 crc kubenswrapper[4957]: I0123 10:53:10.876696 4957 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-fnxz6" podStartSLOduration=81.876665728 podStartE2EDuration="1m21.876665728s" podCreationTimestamp="2026-01-23 10:51:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-23 10:53:10.858754342 +0000 UTC m=+100.396007069" watchObservedRunningTime="2026-01-23 10:53:10.876665728 +0000 UTC m=+100.413918455"
Jan 23 10:53:10 crc kubenswrapper[4957]: I0123 10:53:10.917848 4957 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=82.917817231 podStartE2EDuration="1m22.917817231s" podCreationTimestamp="2026-01-23 10:51:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-23 10:53:10.917313358 +0000 UTC m=+100.454566105" watchObservedRunningTime="2026-01-23 10:53:10.917817231 +0000 UTC m=+100.455069958"
Jan 23 10:53:10 crc kubenswrapper[4957]: I0123 10:53:10.918774 4957 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=29.918754736 podStartE2EDuration="29.918754736s" podCreationTimestamp="2026-01-23 10:52:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-23 10:53:10.891677449 +0000 UTC m=+100.428930206" watchObservedRunningTime="2026-01-23 10:53:10.918754736 +0000 UTC m=+100.456007463"
Jan 23 10:53:10 crc kubenswrapper[4957]: I0123 10:53:10.963550 4957 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=46.963526633 podStartE2EDuration="46.963526633s" podCreationTimestamp="2026-01-23 10:52:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-23 10:53:10.942475733 +0000 UTC m=+100.479728490" watchObservedRunningTime="2026-01-23 10:53:10.963526633 +0000 UTC m=+100.500779330"
Jan 23 10:53:10 crc kubenswrapper[4957]: I0123 10:53:10.967884 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 23 10:53:10 crc kubenswrapper[4957]: I0123 10:53:10.967935 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 23 10:53:10 crc kubenswrapper[4957]: I0123 10:53:10.967956 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 23 10:53:10 crc kubenswrapper[4957]: I0123 10:53:10.967978 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 23 10:53:10 crc kubenswrapper[4957]: I0123 10:53:10.967994 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:10Z","lastTransitionTime":"2026-01-23T10:53:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
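
Each pod_startup_latency_tracker.go:104 entry above records how long a pod took to reach Running relative to its creation: for openshift-etcd/etcd-crc, 10:53:10.823719519 minus the podCreationTimestamp of 10:51:49 is 81.823719519 seconds, which matches both podStartSLOduration and podStartE2EDuration (the two coincide here; firstStartedPulling and lastFinishedPulling are the zero time, so no image pull was recorded). A minimal Go sketch of that arithmetic, using the etcd-crc values above and not reproducing the tracker's actual code:

    package main

    import (
    	"fmt"
    	"time"
    )

    func main() {
    	// podCreationTimestamp and watchObservedRunningTime from the etcd-crc entry above.
    	created := time.Date(2026, time.January, 23, 10, 51, 49, 0, time.UTC)
    	running := time.Date(2026, time.January, 23, 10, 53, 10, 823719519, time.UTC)

    	// With no recorded image pull, SLO duration and end-to-end duration are equal.
    	d := running.Sub(created)
    	fmt.Println(d)           // 1m21.823719519s
    	fmt.Println(d.Seconds()) // 81.823719519
    }
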
Jan 23 10:53:11 crc kubenswrapper[4957]: I0123 10:53:11.053105 4957 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-6cq2v" podStartSLOduration=82.053083406 podStartE2EDuration="1m22.053083406s" podCreationTimestamp="2026-01-23 10:51:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-23 10:53:11.032081848 +0000 UTC m=+100.569334545" watchObservedRunningTime="2026-01-23 10:53:11.053083406 +0000 UTC m=+100.590336103"
Jan 23 10:53:11 crc kubenswrapper[4957]: I0123 10:53:11.070507 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 23 10:53:11 crc kubenswrapper[4957]: I0123 10:53:11.070586 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 23 10:53:11 crc kubenswrapper[4957]: I0123 10:53:11.070603 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 23 10:53:11 crc kubenswrapper[4957]: I0123 10:53:11.070623 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 23 10:53:11 crc kubenswrapper[4957]: I0123 10:53:11.070637 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:11Z","lastTransitionTime":"2026-01-23T10:53:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
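
Each setters.go:603 entry above is the kubelet republishing the node Ready condition as False with reason KubeletNotReady, and it keeps doing so until a CNI configuration file shows up under /etc/kubernetes/cni/net.d/. The same condition can be read back from the API server; below is a minimal client-go sketch (outside the kubelet, with an assumed kubeconfig path) that prints the Ready condition for node crc.

    package main

    import (
    	"context"
    	"fmt"
    	"log"

    	corev1 "k8s.io/api/core/v1"
    	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
    	"k8s.io/client-go/kubernetes"
    	"k8s.io/client-go/tools/clientcmd"
    )

    func main() {
    	// Assumed kubeconfig location; adjust for the environment at hand.
    	cfg, err := clientcmd.BuildConfigFromFlags("", "/home/core/.kube/config")
    	if err != nil {
    		log.Fatal(err)
    	}
    	client, err := kubernetes.NewForConfig(cfg)
    	if err != nil {
    		log.Fatal(err)
    	}
    	node, err := client.CoreV1().Nodes().Get(context.TODO(), "crc", metav1.GetOptions{})
    	if err != nil {
    		log.Fatal(err)
    	}
    	for _, c := range node.Status.Conditions {
    		if c.Type == corev1.NodeReady {
    			// Mirrors the condition the kubelet is setting in the entries above.
    			fmt.Printf("Ready=%s reason=%s message=%q\n", c.Status, c.Reason, c.Message)
    		}
    	}
    }
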
Has your network provider started?"} Jan 23 10:53:11 crc kubenswrapper[4957]: I0123 10:53:11.094515 4957 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=83.094494776 podStartE2EDuration="1m23.094494776s" podCreationTimestamp="2026-01-23 10:51:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-23 10:53:11.093014537 +0000 UTC m=+100.630267234" watchObservedRunningTime="2026-01-23 10:53:11.094494776 +0000 UTC m=+100.631747473" Jan 23 10:53:11 crc kubenswrapper[4957]: I0123 10:53:11.106817 4957 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-rg9hb" podStartSLOduration=82.106795946 podStartE2EDuration="1m22.106795946s" podCreationTimestamp="2026-01-23 10:51:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-23 10:53:11.106089347 +0000 UTC m=+100.643342044" watchObservedRunningTime="2026-01-23 10:53:11.106795946 +0000 UTC m=+100.644048663" Jan 23 10:53:11 crc kubenswrapper[4957]: I0123 10:53:11.120741 4957 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-w2xjv" podStartSLOduration=82.120720989 podStartE2EDuration="1m22.120720989s" podCreationTimestamp="2026-01-23 10:51:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-23 10:53:11.120230026 +0000 UTC m=+100.657482723" watchObservedRunningTime="2026-01-23 10:53:11.120720989 +0000 UTC m=+100.657973686" Jan 23 10:53:11 crc kubenswrapper[4957]: I0123 10:53:11.138167 4957 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-tlz2g" podStartSLOduration=82.138148213 podStartE2EDuration="1m22.138148213s" podCreationTimestamp="2026-01-23 10:51:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-23 10:53:11.137201739 +0000 UTC m=+100.674454436" watchObservedRunningTime="2026-01-23 10:53:11.138148213 +0000 UTC m=+100.675400910" Jan 23 10:53:11 crc kubenswrapper[4957]: I0123 10:53:11.172419 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:11 crc kubenswrapper[4957]: I0123 10:53:11.172478 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:11 crc kubenswrapper[4957]: I0123 10:53:11.172495 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:11 crc kubenswrapper[4957]: I0123 10:53:11.172520 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:11 crc kubenswrapper[4957]: I0123 10:53:11.172537 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:11Z","lastTransitionTime":"2026-01-23T10:53:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:53:11 crc kubenswrapper[4957]: I0123 10:53:11.182665 4957 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9rkq" podStartSLOduration=81.182647953 podStartE2EDuration="1m21.182647953s" podCreationTimestamp="2026-01-23 10:51:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-23 10:53:11.18178677 +0000 UTC m=+100.719039497" watchObservedRunningTime="2026-01-23 10:53:11.182647953 +0000 UTC m=+100.719900650" Jan 23 10:53:11 crc kubenswrapper[4957]: I0123 10:53:11.275611 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:11 crc kubenswrapper[4957]: I0123 10:53:11.275669 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:11 crc kubenswrapper[4957]: I0123 10:53:11.275686 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:11 crc kubenswrapper[4957]: I0123 10:53:11.275713 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:11 crc kubenswrapper[4957]: I0123 10:53:11.275732 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:11Z","lastTransitionTime":"2026-01-23T10:53:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:53:11 crc kubenswrapper[4957]: I0123 10:53:11.378092 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:11 crc kubenswrapper[4957]: I0123 10:53:11.378487 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:11 crc kubenswrapper[4957]: I0123 10:53:11.378642 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:11 crc kubenswrapper[4957]: I0123 10:53:11.378781 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:11 crc kubenswrapper[4957]: I0123 10:53:11.378920 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:11Z","lastTransitionTime":"2026-01-23T10:53:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:53:11 crc kubenswrapper[4957]: I0123 10:53:11.482026 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:11 crc kubenswrapper[4957]: I0123 10:53:11.482108 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:11 crc kubenswrapper[4957]: I0123 10:53:11.482124 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:11 crc kubenswrapper[4957]: I0123 10:53:11.482143 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:11 crc kubenswrapper[4957]: I0123 10:53:11.482155 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:11Z","lastTransitionTime":"2026-01-23T10:53:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:53:11 crc kubenswrapper[4957]: I0123 10:53:11.585332 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:11 crc kubenswrapper[4957]: I0123 10:53:11.585368 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:11 crc kubenswrapper[4957]: I0123 10:53:11.585375 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:11 crc kubenswrapper[4957]: I0123 10:53:11.585390 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:11 crc kubenswrapper[4957]: I0123 10:53:11.585401 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:11Z","lastTransitionTime":"2026-01-23T10:53:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:53:11 crc kubenswrapper[4957]: I0123 10:53:11.687751 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:11 crc kubenswrapper[4957]: I0123 10:53:11.687810 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:11 crc kubenswrapper[4957]: I0123 10:53:11.687827 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:11 crc kubenswrapper[4957]: I0123 10:53:11.687852 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:11 crc kubenswrapper[4957]: I0123 10:53:11.687869 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:11Z","lastTransitionTime":"2026-01-23T10:53:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:53:11 crc kubenswrapper[4957]: I0123 10:53:11.777363 4957 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-16 22:01:01.935722066 +0000 UTC Jan 23 10:53:11 crc kubenswrapper[4957]: I0123 10:53:11.790601 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:11 crc kubenswrapper[4957]: I0123 10:53:11.790669 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:11 crc kubenswrapper[4957]: I0123 10:53:11.790688 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:11 crc kubenswrapper[4957]: I0123 10:53:11.790713 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:11 crc kubenswrapper[4957]: I0123 10:53:11.790734 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:11Z","lastTransitionTime":"2026-01-23T10:53:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:53:11 crc kubenswrapper[4957]: I0123 10:53:11.893261 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:11 crc kubenswrapper[4957]: I0123 10:53:11.893366 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:11 crc kubenswrapper[4957]: I0123 10:53:11.893387 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:11 crc kubenswrapper[4957]: I0123 10:53:11.893415 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:11 crc kubenswrapper[4957]: I0123 10:53:11.893433 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:11Z","lastTransitionTime":"2026-01-23T10:53:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:53:11 crc kubenswrapper[4957]: I0123 10:53:11.996299 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:11 crc kubenswrapper[4957]: I0123 10:53:11.996370 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:11 crc kubenswrapper[4957]: I0123 10:53:11.996382 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:11 crc kubenswrapper[4957]: I0123 10:53:11.996402 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:11 crc kubenswrapper[4957]: I0123 10:53:11.996415 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:11Z","lastTransitionTime":"2026-01-23T10:53:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:53:12 crc kubenswrapper[4957]: I0123 10:53:12.099221 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:12 crc kubenswrapper[4957]: I0123 10:53:12.099327 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:12 crc kubenswrapper[4957]: I0123 10:53:12.099347 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:12 crc kubenswrapper[4957]: I0123 10:53:12.099376 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:12 crc kubenswrapper[4957]: I0123 10:53:12.099396 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:12Z","lastTransitionTime":"2026-01-23T10:53:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:53:12 crc kubenswrapper[4957]: I0123 10:53:12.202267 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:12 crc kubenswrapper[4957]: I0123 10:53:12.202343 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:12 crc kubenswrapper[4957]: I0123 10:53:12.202354 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:12 crc kubenswrapper[4957]: I0123 10:53:12.202370 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:12 crc kubenswrapper[4957]: I0123 10:53:12.202381 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:12Z","lastTransitionTime":"2026-01-23T10:53:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:53:12 crc kubenswrapper[4957]: I0123 10:53:12.305335 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:12 crc kubenswrapper[4957]: I0123 10:53:12.305411 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:12 crc kubenswrapper[4957]: I0123 10:53:12.305424 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:12 crc kubenswrapper[4957]: I0123 10:53:12.305448 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:12 crc kubenswrapper[4957]: I0123 10:53:12.305475 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:12Z","lastTransitionTime":"2026-01-23T10:53:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:53:12 crc kubenswrapper[4957]: I0123 10:53:12.408960 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:12 crc kubenswrapper[4957]: I0123 10:53:12.409019 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:12 crc kubenswrapper[4957]: I0123 10:53:12.409036 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:12 crc kubenswrapper[4957]: I0123 10:53:12.409094 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:12 crc kubenswrapper[4957]: I0123 10:53:12.409118 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:12Z","lastTransitionTime":"2026-01-23T10:53:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:53:12 crc kubenswrapper[4957]: I0123 10:53:12.511613 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:12 crc kubenswrapper[4957]: I0123 10:53:12.511683 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:12 crc kubenswrapper[4957]: I0123 10:53:12.511700 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:12 crc kubenswrapper[4957]: I0123 10:53:12.511726 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:12 crc kubenswrapper[4957]: I0123 10:53:12.511742 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:12Z","lastTransitionTime":"2026-01-23T10:53:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:53:12 crc kubenswrapper[4957]: I0123 10:53:12.614223 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:12 crc kubenswrapper[4957]: I0123 10:53:12.614714 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:12 crc kubenswrapper[4957]: I0123 10:53:12.614736 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:12 crc kubenswrapper[4957]: I0123 10:53:12.614762 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:12 crc kubenswrapper[4957]: I0123 10:53:12.614779 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:12Z","lastTransitionTime":"2026-01-23T10:53:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:53:12 crc kubenswrapper[4957]: I0123 10:53:12.721449 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:12 crc kubenswrapper[4957]: I0123 10:53:12.721529 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:12 crc kubenswrapper[4957]: I0123 10:53:12.721551 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:12 crc kubenswrapper[4957]: I0123 10:53:12.721576 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:12 crc kubenswrapper[4957]: I0123 10:53:12.721608 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:12Z","lastTransitionTime":"2026-01-23T10:53:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:53:12 crc kubenswrapper[4957]: I0123 10:53:12.769234 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 23 10:53:12 crc kubenswrapper[4957]: I0123 10:53:12.769234 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 23 10:53:12 crc kubenswrapper[4957]: I0123 10:53:12.769435 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 23 10:53:12 crc kubenswrapper[4957]: I0123 10:53:12.769534 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5fxfb" Jan 23 10:53:12 crc kubenswrapper[4957]: E0123 10:53:12.769529 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 23 10:53:12 crc kubenswrapper[4957]: E0123 10:53:12.769727 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 23 10:53:12 crc kubenswrapper[4957]: E0123 10:53:12.769861 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 23 10:53:12 crc kubenswrapper[4957]: E0123 10:53:12.769978 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5fxfb" podUID="87775b38-0664-48f6-8857-7568c135bd79" Jan 23 10:53:12 crc kubenswrapper[4957]: I0123 10:53:12.778015 4957 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-10 22:20:23.378132841 +0000 UTC Jan 23 10:53:12 crc kubenswrapper[4957]: I0123 10:53:12.824830 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:12 crc kubenswrapper[4957]: I0123 10:53:12.824874 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:12 crc kubenswrapper[4957]: I0123 10:53:12.824885 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:12 crc kubenswrapper[4957]: I0123 10:53:12.824902 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:12 crc kubenswrapper[4957]: I0123 10:53:12.824913 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:12Z","lastTransitionTime":"2026-01-23T10:53:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:53:12 crc kubenswrapper[4957]: I0123 10:53:12.927489 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:12 crc kubenswrapper[4957]: I0123 10:53:12.927520 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:12 crc kubenswrapper[4957]: I0123 10:53:12.927545 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:12 crc kubenswrapper[4957]: I0123 10:53:12.927558 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:12 crc kubenswrapper[4957]: I0123 10:53:12.927566 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:12Z","lastTransitionTime":"2026-01-23T10:53:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:53:13 crc kubenswrapper[4957]: I0123 10:53:13.030638 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:13 crc kubenswrapper[4957]: I0123 10:53:13.030692 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:13 crc kubenswrapper[4957]: I0123 10:53:13.030710 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:13 crc kubenswrapper[4957]: I0123 10:53:13.030732 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:13 crc kubenswrapper[4957]: I0123 10:53:13.030747 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:13Z","lastTransitionTime":"2026-01-23T10:53:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:53:13 crc kubenswrapper[4957]: I0123 10:53:13.134542 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:13 crc kubenswrapper[4957]: I0123 10:53:13.134648 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:13 crc kubenswrapper[4957]: I0123 10:53:13.134673 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:13 crc kubenswrapper[4957]: I0123 10:53:13.134707 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:13 crc kubenswrapper[4957]: I0123 10:53:13.134732 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:13Z","lastTransitionTime":"2026-01-23T10:53:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:53:13 crc kubenswrapper[4957]: I0123 10:53:13.237642 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:13 crc kubenswrapper[4957]: I0123 10:53:13.237730 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:13 crc kubenswrapper[4957]: I0123 10:53:13.237741 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:13 crc kubenswrapper[4957]: I0123 10:53:13.237757 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:13 crc kubenswrapper[4957]: I0123 10:53:13.237770 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:13Z","lastTransitionTime":"2026-01-23T10:53:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:53:13 crc kubenswrapper[4957]: I0123 10:53:13.340734 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:13 crc kubenswrapper[4957]: I0123 10:53:13.340798 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:13 crc kubenswrapper[4957]: I0123 10:53:13.340817 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:13 crc kubenswrapper[4957]: I0123 10:53:13.340842 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:13 crc kubenswrapper[4957]: I0123 10:53:13.340859 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:13Z","lastTransitionTime":"2026-01-23T10:53:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:53:13 crc kubenswrapper[4957]: I0123 10:53:13.444336 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:13 crc kubenswrapper[4957]: I0123 10:53:13.444415 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:13 crc kubenswrapper[4957]: I0123 10:53:13.444442 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:13 crc kubenswrapper[4957]: I0123 10:53:13.444472 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:13 crc kubenswrapper[4957]: I0123 10:53:13.444498 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:13Z","lastTransitionTime":"2026-01-23T10:53:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:53:13 crc kubenswrapper[4957]: I0123 10:53:13.547272 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:13 crc kubenswrapper[4957]: I0123 10:53:13.547354 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:13 crc kubenswrapper[4957]: I0123 10:53:13.547373 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:13 crc kubenswrapper[4957]: I0123 10:53:13.547398 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:13 crc kubenswrapper[4957]: I0123 10:53:13.547415 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:13Z","lastTransitionTime":"2026-01-23T10:53:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:53:13 crc kubenswrapper[4957]: I0123 10:53:13.650959 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:13 crc kubenswrapper[4957]: I0123 10:53:13.651018 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:13 crc kubenswrapper[4957]: I0123 10:53:13.651035 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:13 crc kubenswrapper[4957]: I0123 10:53:13.651059 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:13 crc kubenswrapper[4957]: I0123 10:53:13.651076 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:13Z","lastTransitionTime":"2026-01-23T10:53:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:53:13 crc kubenswrapper[4957]: I0123 10:53:13.754004 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:13 crc kubenswrapper[4957]: I0123 10:53:13.754118 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:13 crc kubenswrapper[4957]: I0123 10:53:13.754138 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:13 crc kubenswrapper[4957]: I0123 10:53:13.754164 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:13 crc kubenswrapper[4957]: I0123 10:53:13.754182 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:13Z","lastTransitionTime":"2026-01-23T10:53:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:53:13 crc kubenswrapper[4957]: I0123 10:53:13.778672 4957 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-03 01:53:44.624953167 +0000 UTC Jan 23 10:53:13 crc kubenswrapper[4957]: I0123 10:53:13.857144 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:13 crc kubenswrapper[4957]: I0123 10:53:13.857205 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:13 crc kubenswrapper[4957]: I0123 10:53:13.857226 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:13 crc kubenswrapper[4957]: I0123 10:53:13.857253 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:13 crc kubenswrapper[4957]: I0123 10:53:13.857311 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:13Z","lastTransitionTime":"2026-01-23T10:53:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:53:13 crc kubenswrapper[4957]: I0123 10:53:13.960788 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:13 crc kubenswrapper[4957]: I0123 10:53:13.960851 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:13 crc kubenswrapper[4957]: I0123 10:53:13.960870 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:13 crc kubenswrapper[4957]: I0123 10:53:13.960894 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:13 crc kubenswrapper[4957]: I0123 10:53:13.960912 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:13Z","lastTransitionTime":"2026-01-23T10:53:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:53:14 crc kubenswrapper[4957]: I0123 10:53:14.063371 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:14 crc kubenswrapper[4957]: I0123 10:53:14.063433 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:14 crc kubenswrapper[4957]: I0123 10:53:14.063455 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:14 crc kubenswrapper[4957]: I0123 10:53:14.063484 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:14 crc kubenswrapper[4957]: I0123 10:53:14.063501 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:14Z","lastTransitionTime":"2026-01-23T10:53:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:53:14 crc kubenswrapper[4957]: I0123 10:53:14.166843 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:14 crc kubenswrapper[4957]: I0123 10:53:14.166908 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:14 crc kubenswrapper[4957]: I0123 10:53:14.166925 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:14 crc kubenswrapper[4957]: I0123 10:53:14.166961 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:14 crc kubenswrapper[4957]: I0123 10:53:14.166980 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:14Z","lastTransitionTime":"2026-01-23T10:53:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:53:14 crc kubenswrapper[4957]: I0123 10:53:14.262436 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:14 crc kubenswrapper[4957]: I0123 10:53:14.262527 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:14 crc kubenswrapper[4957]: I0123 10:53:14.262537 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:14 crc kubenswrapper[4957]: I0123 10:53:14.262554 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:14 crc kubenswrapper[4957]: I0123 10:53:14.262563 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:14Z","lastTransitionTime":"2026-01-23T10:53:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 23 10:53:14 crc kubenswrapper[4957]: I0123 10:53:14.284316 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 23 10:53:14 crc kubenswrapper[4957]: I0123 10:53:14.284355 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 23 10:53:14 crc kubenswrapper[4957]: I0123 10:53:14.284366 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 23 10:53:14 crc kubenswrapper[4957]: I0123 10:53:14.284382 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 23 10:53:14 crc kubenswrapper[4957]: I0123 10:53:14.284392 4957 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-23T10:53:14Z","lastTransitionTime":"2026-01-23T10:53:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 23 10:53:14 crc kubenswrapper[4957]: I0123 10:53:14.323457 4957 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-n7bvv"] Jan 23 10:53:14 crc kubenswrapper[4957]: I0123 10:53:14.323773 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-n7bvv" Jan 23 10:53:14 crc kubenswrapper[4957]: I0123 10:53:14.327199 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Jan 23 10:53:14 crc kubenswrapper[4957]: I0123 10:53:14.327774 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Jan 23 10:53:14 crc kubenswrapper[4957]: I0123 10:53:14.328090 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Jan 23 10:53:14 crc kubenswrapper[4957]: I0123 10:53:14.328751 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Jan 23 10:53:14 crc kubenswrapper[4957]: I0123 10:53:14.338274 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/63e68ee2-5d7b-4a86-b6e4-a7b68c133dde-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-n7bvv\" (UID: \"63e68ee2-5d7b-4a86-b6e4-a7b68c133dde\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-n7bvv" Jan 23 10:53:14 crc kubenswrapper[4957]: I0123 10:53:14.338355 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/63e68ee2-5d7b-4a86-b6e4-a7b68c133dde-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-n7bvv\" (UID: \"63e68ee2-5d7b-4a86-b6e4-a7b68c133dde\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-n7bvv" Jan 23 10:53:14 crc kubenswrapper[4957]: I0123 10:53:14.338406 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: 
\"kubernetes.io/host-path/63e68ee2-5d7b-4a86-b6e4-a7b68c133dde-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-n7bvv\" (UID: \"63e68ee2-5d7b-4a86-b6e4-a7b68c133dde\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-n7bvv" Jan 23 10:53:14 crc kubenswrapper[4957]: I0123 10:53:14.338451 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/63e68ee2-5d7b-4a86-b6e4-a7b68c133dde-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-n7bvv\" (UID: \"63e68ee2-5d7b-4a86-b6e4-a7b68c133dde\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-n7bvv" Jan 23 10:53:14 crc kubenswrapper[4957]: I0123 10:53:14.338487 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/63e68ee2-5d7b-4a86-b6e4-a7b68c133dde-service-ca\") pod \"cluster-version-operator-5c965bbfc6-n7bvv\" (UID: \"63e68ee2-5d7b-4a86-b6e4-a7b68c133dde\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-n7bvv" Jan 23 10:53:14 crc kubenswrapper[4957]: I0123 10:53:14.439870 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/63e68ee2-5d7b-4a86-b6e4-a7b68c133dde-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-n7bvv\" (UID: \"63e68ee2-5d7b-4a86-b6e4-a7b68c133dde\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-n7bvv" Jan 23 10:53:14 crc kubenswrapper[4957]: I0123 10:53:14.439979 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/63e68ee2-5d7b-4a86-b6e4-a7b68c133dde-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-n7bvv\" (UID: \"63e68ee2-5d7b-4a86-b6e4-a7b68c133dde\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-n7bvv" Jan 23 10:53:14 crc kubenswrapper[4957]: I0123 10:53:14.440059 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/63e68ee2-5d7b-4a86-b6e4-a7b68c133dde-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-n7bvv\" (UID: \"63e68ee2-5d7b-4a86-b6e4-a7b68c133dde\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-n7bvv" Jan 23 10:53:14 crc kubenswrapper[4957]: I0123 10:53:14.440103 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/63e68ee2-5d7b-4a86-b6e4-a7b68c133dde-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-n7bvv\" (UID: \"63e68ee2-5d7b-4a86-b6e4-a7b68c133dde\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-n7bvv" Jan 23 10:53:14 crc kubenswrapper[4957]: I0123 10:53:14.440172 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/63e68ee2-5d7b-4a86-b6e4-a7b68c133dde-service-ca\") pod \"cluster-version-operator-5c965bbfc6-n7bvv\" (UID: \"63e68ee2-5d7b-4a86-b6e4-a7b68c133dde\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-n7bvv" Jan 23 10:53:14 crc kubenswrapper[4957]: I0123 10:53:14.440194 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: 
\"kubernetes.io/host-path/63e68ee2-5d7b-4a86-b6e4-a7b68c133dde-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-n7bvv\" (UID: \"63e68ee2-5d7b-4a86-b6e4-a7b68c133dde\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-n7bvv" Jan 23 10:53:14 crc kubenswrapper[4957]: I0123 10:53:14.440369 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/63e68ee2-5d7b-4a86-b6e4-a7b68c133dde-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-n7bvv\" (UID: \"63e68ee2-5d7b-4a86-b6e4-a7b68c133dde\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-n7bvv" Jan 23 10:53:14 crc kubenswrapper[4957]: I0123 10:53:14.441103 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/63e68ee2-5d7b-4a86-b6e4-a7b68c133dde-service-ca\") pod \"cluster-version-operator-5c965bbfc6-n7bvv\" (UID: \"63e68ee2-5d7b-4a86-b6e4-a7b68c133dde\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-n7bvv" Jan 23 10:53:14 crc kubenswrapper[4957]: I0123 10:53:14.446401 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/63e68ee2-5d7b-4a86-b6e4-a7b68c133dde-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-n7bvv\" (UID: \"63e68ee2-5d7b-4a86-b6e4-a7b68c133dde\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-n7bvv" Jan 23 10:53:14 crc kubenswrapper[4957]: I0123 10:53:14.460314 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/63e68ee2-5d7b-4a86-b6e4-a7b68c133dde-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-n7bvv\" (UID: \"63e68ee2-5d7b-4a86-b6e4-a7b68c133dde\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-n7bvv" Jan 23 10:53:14 crc kubenswrapper[4957]: I0123 10:53:14.648505 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-n7bvv" Jan 23 10:53:14 crc kubenswrapper[4957]: I0123 10:53:14.769547 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 23 10:53:14 crc kubenswrapper[4957]: I0123 10:53:14.769589 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 23 10:53:14 crc kubenswrapper[4957]: I0123 10:53:14.769611 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5fxfb" Jan 23 10:53:14 crc kubenswrapper[4957]: I0123 10:53:14.769547 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 23 10:53:14 crc kubenswrapper[4957]: E0123 10:53:14.769695 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 23 10:53:14 crc kubenswrapper[4957]: E0123 10:53:14.769798 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 23 10:53:14 crc kubenswrapper[4957]: E0123 10:53:14.769886 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5fxfb" podUID="87775b38-0664-48f6-8857-7568c135bd79" Jan 23 10:53:14 crc kubenswrapper[4957]: E0123 10:53:14.770001 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 23 10:53:14 crc kubenswrapper[4957]: I0123 10:53:14.778957 4957 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-03 16:18:06.7900627 +0000 UTC Jan 23 10:53:14 crc kubenswrapper[4957]: I0123 10:53:14.779015 4957 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Jan 23 10:53:14 crc kubenswrapper[4957]: I0123 10:53:14.788488 4957 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Jan 23 10:53:15 crc kubenswrapper[4957]: I0123 10:53:15.333327 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-n7bvv" event={"ID":"63e68ee2-5d7b-4a86-b6e4-a7b68c133dde","Type":"ContainerStarted","Data":"f8465afe634d35ab46cec3d7461b732cec2c07c990ac22b4daccda39bceec9e9"} Jan 23 10:53:15 crc kubenswrapper[4957]: I0123 10:53:15.333400 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-n7bvv" event={"ID":"63e68ee2-5d7b-4a86-b6e4-a7b68c133dde","Type":"ContainerStarted","Data":"f5db16a6354248da09d7f82ae6777b824ace96f32b76f034ad412445a2fdbf69"} Jan 23 10:53:16 crc kubenswrapper[4957]: I0123 10:53:16.769823 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 23 10:53:16 crc kubenswrapper[4957]: I0123 10:53:16.769944 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5fxfb" Jan 23 10:53:16 crc kubenswrapper[4957]: E0123 10:53:16.770044 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 23 10:53:16 crc kubenswrapper[4957]: I0123 10:53:16.770062 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 23 10:53:16 crc kubenswrapper[4957]: I0123 10:53:16.770136 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 23 10:53:16 crc kubenswrapper[4957]: E0123 10:53:16.770209 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 23 10:53:16 crc kubenswrapper[4957]: E0123 10:53:16.770322 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 23 10:53:16 crc kubenswrapper[4957]: E0123 10:53:16.770427 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5fxfb" podUID="87775b38-0664-48f6-8857-7568c135bd79" Jan 23 10:53:18 crc kubenswrapper[4957]: I0123 10:53:18.770022 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 23 10:53:18 crc kubenswrapper[4957]: E0123 10:53:18.770240 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 23 10:53:18 crc kubenswrapper[4957]: I0123 10:53:18.770526 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 23 10:53:18 crc kubenswrapper[4957]: I0123 10:53:18.770631 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5fxfb" Jan 23 10:53:18 crc kubenswrapper[4957]: E0123 10:53:18.770753 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5fxfb" podUID="87775b38-0664-48f6-8857-7568c135bd79" Jan 23 10:53:18 crc kubenswrapper[4957]: I0123 10:53:18.770802 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 23 10:53:18 crc kubenswrapper[4957]: E0123 10:53:18.770841 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 23 10:53:18 crc kubenswrapper[4957]: E0123 10:53:18.770939 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 23 10:53:19 crc kubenswrapper[4957]: I0123 10:53:19.769726 4957 scope.go:117] "RemoveContainer" containerID="6cbd0fe66fb090078f66ddc5174cf5273cbe2b54ca7beb8afcf6de97c848666e" Jan 23 10:53:19 crc kubenswrapper[4957]: E0123 10:53:19.769921 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-z8hcw_openshift-ovn-kubernetes(87adc28a-89e3-4743-a9f2-098d4a9432d8)\"" pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" podUID="87adc28a-89e3-4743-a9f2-098d4a9432d8" Jan 23 10:53:20 crc kubenswrapper[4957]: I0123 10:53:20.769136 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5fxfb" Jan 23 10:53:20 crc kubenswrapper[4957]: I0123 10:53:20.769208 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 23 10:53:20 crc kubenswrapper[4957]: E0123 10:53:20.770848 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5fxfb" podUID="87775b38-0664-48f6-8857-7568c135bd79" Jan 23 10:53:20 crc kubenswrapper[4957]: I0123 10:53:20.770871 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 23 10:53:20 crc kubenswrapper[4957]: I0123 10:53:20.770936 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 23 10:53:20 crc kubenswrapper[4957]: E0123 10:53:20.771096 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 23 10:53:20 crc kubenswrapper[4957]: E0123 10:53:20.771558 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 23 10:53:20 crc kubenswrapper[4957]: E0123 10:53:20.771636 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 23 10:53:22 crc kubenswrapper[4957]: I0123 10:53:22.769857 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 23 10:53:22 crc kubenswrapper[4957]: I0123 10:53:22.769972 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5fxfb" Jan 23 10:53:22 crc kubenswrapper[4957]: I0123 10:53:22.770095 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 23 10:53:22 crc kubenswrapper[4957]: E0123 10:53:22.770013 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 23 10:53:22 crc kubenswrapper[4957]: I0123 10:53:22.770266 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 23 10:53:22 crc kubenswrapper[4957]: E0123 10:53:22.770415 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5fxfb" podUID="87775b38-0664-48f6-8857-7568c135bd79" Jan 23 10:53:22 crc kubenswrapper[4957]: E0123 10:53:22.770519 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 23 10:53:22 crc kubenswrapper[4957]: E0123 10:53:22.770642 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 23 10:53:24 crc kubenswrapper[4957]: I0123 10:53:24.365564 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-tlz2g_233fdd78-4010-4fe8-9068-ee47d8ff25d1/kube-multus/1.log" Jan 23 10:53:24 crc kubenswrapper[4957]: I0123 10:53:24.366753 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-tlz2g_233fdd78-4010-4fe8-9068-ee47d8ff25d1/kube-multus/0.log" Jan 23 10:53:24 crc kubenswrapper[4957]: I0123 10:53:24.366820 4957 generic.go:334] "Generic (PLEG): container finished" podID="233fdd78-4010-4fe8-9068-ee47d8ff25d1" containerID="a41f7e81b1359b374160b43aed747c1058d4a086980d803825ae41e507f3d77c" exitCode=1 Jan 23 10:53:24 crc kubenswrapper[4957]: I0123 10:53:24.366864 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-tlz2g" event={"ID":"233fdd78-4010-4fe8-9068-ee47d8ff25d1","Type":"ContainerDied","Data":"a41f7e81b1359b374160b43aed747c1058d4a086980d803825ae41e507f3d77c"} Jan 23 10:53:24 crc kubenswrapper[4957]: I0123 10:53:24.366912 4957 scope.go:117] "RemoveContainer" containerID="d6851e0ec1040550b8c9edb1b85213d2c849e381fae6b0f09c9a7247bd9c5088" Jan 23 10:53:24 crc kubenswrapper[4957]: I0123 10:53:24.367524 4957 scope.go:117] "RemoveContainer" containerID="a41f7e81b1359b374160b43aed747c1058d4a086980d803825ae41e507f3d77c" Jan 23 10:53:24 crc kubenswrapper[4957]: E0123 10:53:24.367804 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-tlz2g_openshift-multus(233fdd78-4010-4fe8-9068-ee47d8ff25d1)\"" pod="openshift-multus/multus-tlz2g" podUID="233fdd78-4010-4fe8-9068-ee47d8ff25d1" Jan 23 10:53:24 crc kubenswrapper[4957]: I0123 10:53:24.399206 4957 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-n7bvv" podStartSLOduration=95.399175412 podStartE2EDuration="1m35.399175412s" podCreationTimestamp="2026-01-23 10:51:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-23 10:53:15.352922346 +0000 UTC m=+104.890175043" watchObservedRunningTime="2026-01-23 10:53:24.399175412 +0000 UTC m=+113.936428119" Jan 23 10:53:24 crc kubenswrapper[4957]: I0123 10:53:24.769439 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 23 10:53:24 crc kubenswrapper[4957]: I0123 10:53:24.769517 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 23 10:53:24 crc kubenswrapper[4957]: I0123 10:53:24.769556 4957 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-5fxfb" Jan 23 10:53:24 crc kubenswrapper[4957]: E0123 10:53:24.769672 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 23 10:53:24 crc kubenswrapper[4957]: I0123 10:53:24.769702 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 23 10:53:24 crc kubenswrapper[4957]: E0123 10:53:24.769798 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 23 10:53:24 crc kubenswrapper[4957]: E0123 10:53:24.769900 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 23 10:53:24 crc kubenswrapper[4957]: E0123 10:53:24.769970 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5fxfb" podUID="87775b38-0664-48f6-8857-7568c135bd79" Jan 23 10:53:25 crc kubenswrapper[4957]: I0123 10:53:25.378140 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-tlz2g_233fdd78-4010-4fe8-9068-ee47d8ff25d1/kube-multus/1.log" Jan 23 10:53:26 crc kubenswrapper[4957]: I0123 10:53:26.769020 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 23 10:53:26 crc kubenswrapper[4957]: I0123 10:53:26.769119 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 23 10:53:26 crc kubenswrapper[4957]: E0123 10:53:26.769156 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 23 10:53:26 crc kubenswrapper[4957]: I0123 10:53:26.769204 4957 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 23 10:53:26 crc kubenswrapper[4957]: I0123 10:53:26.769238 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5fxfb" Jan 23 10:53:26 crc kubenswrapper[4957]: E0123 10:53:26.769466 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 23 10:53:26 crc kubenswrapper[4957]: E0123 10:53:26.769607 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 23 10:53:26 crc kubenswrapper[4957]: E0123 10:53:26.769731 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5fxfb" podUID="87775b38-0664-48f6-8857-7568c135bd79" Jan 23 10:53:27 crc kubenswrapper[4957]: I0123 10:53:27.903611 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 23 10:53:27 crc kubenswrapper[4957]: E0123 10:53:27.903742 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 23 10:53:27 crc kubenswrapper[4957]: I0123 10:53:27.903982 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 23 10:53:27 crc kubenswrapper[4957]: E0123 10:53:27.904317 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 23 10:53:28 crc kubenswrapper[4957]: I0123 10:53:28.769364 4957 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-5fxfb" Jan 23 10:53:28 crc kubenswrapper[4957]: E0123 10:53:28.769500 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5fxfb" podUID="87775b38-0664-48f6-8857-7568c135bd79" Jan 23 10:53:28 crc kubenswrapper[4957]: I0123 10:53:28.769543 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 23 10:53:28 crc kubenswrapper[4957]: E0123 10:53:28.769764 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 23 10:53:29 crc kubenswrapper[4957]: I0123 10:53:29.769447 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 23 10:53:29 crc kubenswrapper[4957]: E0123 10:53:29.769574 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 23 10:53:29 crc kubenswrapper[4957]: I0123 10:53:29.769458 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 23 10:53:29 crc kubenswrapper[4957]: E0123 10:53:29.769734 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 23 10:53:30 crc kubenswrapper[4957]: I0123 10:53:30.769139 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5fxfb" Jan 23 10:53:30 crc kubenswrapper[4957]: I0123 10:53:30.769247 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 23 10:53:30 crc kubenswrapper[4957]: E0123 10:53:30.771150 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5fxfb" podUID="87775b38-0664-48f6-8857-7568c135bd79" Jan 23 10:53:30 crc kubenswrapper[4957]: E0123 10:53:30.771346 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 23 10:53:30 crc kubenswrapper[4957]: E0123 10:53:30.783966 4957 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Jan 23 10:53:30 crc kubenswrapper[4957]: E0123 10:53:30.883391 4957 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Jan 23 10:53:31 crc kubenswrapper[4957]: I0123 10:53:31.769576 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 23 10:53:31 crc kubenswrapper[4957]: I0123 10:53:31.769890 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 23 10:53:31 crc kubenswrapper[4957]: E0123 10:53:31.770125 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 23 10:53:31 crc kubenswrapper[4957]: E0123 10:53:31.770228 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 23 10:53:31 crc kubenswrapper[4957]: I0123 10:53:31.770490 4957 scope.go:117] "RemoveContainer" containerID="6cbd0fe66fb090078f66ddc5174cf5273cbe2b54ca7beb8afcf6de97c848666e" Jan 23 10:53:31 crc kubenswrapper[4957]: E0123 10:53:31.770643 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-z8hcw_openshift-ovn-kubernetes(87adc28a-89e3-4743-a9f2-098d4a9432d8)\"" pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" podUID="87adc28a-89e3-4743-a9f2-098d4a9432d8" Jan 23 10:53:32 crc kubenswrapper[4957]: I0123 10:53:32.769616 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 23 10:53:32 crc kubenswrapper[4957]: I0123 10:53:32.769813 4957 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-5fxfb" Jan 23 10:53:32 crc kubenswrapper[4957]: E0123 10:53:32.769948 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 23 10:53:32 crc kubenswrapper[4957]: E0123 10:53:32.770406 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5fxfb" podUID="87775b38-0664-48f6-8857-7568c135bd79" Jan 23 10:53:33 crc kubenswrapper[4957]: I0123 10:53:33.769716 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 23 10:53:33 crc kubenswrapper[4957]: I0123 10:53:33.769719 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 23 10:53:33 crc kubenswrapper[4957]: E0123 10:53:33.770099 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 23 10:53:33 crc kubenswrapper[4957]: E0123 10:53:33.769914 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 23 10:53:34 crc kubenswrapper[4957]: I0123 10:53:34.769032 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5fxfb" Jan 23 10:53:34 crc kubenswrapper[4957]: E0123 10:53:34.769238 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5fxfb" podUID="87775b38-0664-48f6-8857-7568c135bd79" Jan 23 10:53:34 crc kubenswrapper[4957]: I0123 10:53:34.769368 4957 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 23 10:53:34 crc kubenswrapper[4957]: E0123 10:53:34.769548 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 23 10:53:35 crc kubenswrapper[4957]: I0123 10:53:35.768879 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 23 10:53:35 crc kubenswrapper[4957]: I0123 10:53:35.768907 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 23 10:53:35 crc kubenswrapper[4957]: E0123 10:53:35.769054 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 23 10:53:35 crc kubenswrapper[4957]: E0123 10:53:35.769189 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 23 10:53:35 crc kubenswrapper[4957]: E0123 10:53:35.885887 4957 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Jan 23 10:53:36 crc kubenswrapper[4957]: I0123 10:53:36.769731 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5fxfb" Jan 23 10:53:36 crc kubenswrapper[4957]: I0123 10:53:36.769776 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 23 10:53:36 crc kubenswrapper[4957]: E0123 10:53:36.769907 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5fxfb" podUID="87775b38-0664-48f6-8857-7568c135bd79" Jan 23 10:53:36 crc kubenswrapper[4957]: E0123 10:53:36.769992 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 23 10:53:37 crc kubenswrapper[4957]: I0123 10:53:37.769417 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 23 10:53:37 crc kubenswrapper[4957]: I0123 10:53:37.769489 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 23 10:53:37 crc kubenswrapper[4957]: E0123 10:53:37.769576 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 23 10:53:37 crc kubenswrapper[4957]: E0123 10:53:37.769655 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 23 10:53:38 crc kubenswrapper[4957]: I0123 10:53:38.769434 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5fxfb" Jan 23 10:53:38 crc kubenswrapper[4957]: I0123 10:53:38.769553 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 23 10:53:38 crc kubenswrapper[4957]: E0123 10:53:38.769622 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5fxfb" podUID="87775b38-0664-48f6-8857-7568c135bd79" Jan 23 10:53:38 crc kubenswrapper[4957]: E0123 10:53:38.769794 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 23 10:53:38 crc kubenswrapper[4957]: I0123 10:53:38.770416 4957 scope.go:117] "RemoveContainer" containerID="a41f7e81b1359b374160b43aed747c1058d4a086980d803825ae41e507f3d77c" Jan 23 10:53:39 crc kubenswrapper[4957]: I0123 10:53:39.769485 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 23 10:53:39 crc kubenswrapper[4957]: E0123 10:53:39.769993 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 23 10:53:39 crc kubenswrapper[4957]: I0123 10:53:39.769483 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 23 10:53:39 crc kubenswrapper[4957]: E0123 10:53:39.770369 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 23 10:53:39 crc kubenswrapper[4957]: I0123 10:53:39.948032 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-tlz2g_233fdd78-4010-4fe8-9068-ee47d8ff25d1/kube-multus/1.log" Jan 23 10:53:39 crc kubenswrapper[4957]: I0123 10:53:39.948146 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-tlz2g" event={"ID":"233fdd78-4010-4fe8-9068-ee47d8ff25d1","Type":"ContainerStarted","Data":"8e5c8d9deccce35da00836243a4c325855cbf167f74a231795eea7aff84803a4"} Jan 23 10:53:40 crc kubenswrapper[4957]: I0123 10:53:40.769611 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5fxfb" Jan 23 10:53:40 crc kubenswrapper[4957]: I0123 10:53:40.769684 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 23 10:53:40 crc kubenswrapper[4957]: E0123 10:53:40.770846 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5fxfb" podUID="87775b38-0664-48f6-8857-7568c135bd79" Jan 23 10:53:40 crc kubenswrapper[4957]: E0123 10:53:40.771008 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 23 10:53:40 crc kubenswrapper[4957]: E0123 10:53:40.886517 4957 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Jan 23 10:53:41 crc kubenswrapper[4957]: I0123 10:53:41.769711 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 23 10:53:41 crc kubenswrapper[4957]: I0123 10:53:41.769711 4957 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 23 10:53:41 crc kubenswrapper[4957]: E0123 10:53:41.769946 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 23 10:53:41 crc kubenswrapper[4957]: E0123 10:53:41.770089 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 23 10:53:42 crc kubenswrapper[4957]: I0123 10:53:42.769503 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5fxfb" Jan 23 10:53:42 crc kubenswrapper[4957]: I0123 10:53:42.769591 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 23 10:53:42 crc kubenswrapper[4957]: E0123 10:53:42.769699 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5fxfb" podUID="87775b38-0664-48f6-8857-7568c135bd79" Jan 23 10:53:42 crc kubenswrapper[4957]: E0123 10:53:42.770025 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 23 10:53:42 crc kubenswrapper[4957]: I0123 10:53:42.771111 4957 scope.go:117] "RemoveContainer" containerID="6cbd0fe66fb090078f66ddc5174cf5273cbe2b54ca7beb8afcf6de97c848666e" Jan 23 10:53:42 crc kubenswrapper[4957]: I0123 10:53:42.964608 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z8hcw_87adc28a-89e3-4743-a9f2-098d4a9432d8/ovnkube-controller/3.log" Jan 23 10:53:42 crc kubenswrapper[4957]: I0123 10:53:42.970447 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" event={"ID":"87adc28a-89e3-4743-a9f2-098d4a9432d8","Type":"ContainerStarted","Data":"084ff950e9d3aef7d2a76aab2117ab6a65b6f845927f4cbf12fe8cbd8e56c3a3"} Jan 23 10:53:42 crc kubenswrapper[4957]: I0123 10:53:42.970915 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" Jan 23 10:53:43 crc kubenswrapper[4957]: I0123 10:53:43.021400 4957 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" podStartSLOduration=113.021378168 podStartE2EDuration="1m53.021378168s" podCreationTimestamp="2026-01-23 10:51:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-23 10:53:43.017454846 +0000 UTC m=+132.554707563" watchObservedRunningTime="2026-01-23 10:53:43.021378168 +0000 UTC m=+132.558630865" Jan 23 10:53:43 crc kubenswrapper[4957]: I0123 10:53:43.703630 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-5fxfb"] Jan 23 10:53:43 crc kubenswrapper[4957]: I0123 10:53:43.703735 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5fxfb" Jan 23 10:53:43 crc kubenswrapper[4957]: E0123 10:53:43.703814 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5fxfb" podUID="87775b38-0664-48f6-8857-7568c135bd79" Jan 23 10:53:43 crc kubenswrapper[4957]: I0123 10:53:43.769141 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 23 10:53:43 crc kubenswrapper[4957]: I0123 10:53:43.769155 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 23 10:53:43 crc kubenswrapper[4957]: E0123 10:53:43.769304 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 23 10:53:43 crc kubenswrapper[4957]: E0123 10:53:43.769391 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 23 10:53:44 crc kubenswrapper[4957]: I0123 10:53:44.769493 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 23 10:53:44 crc kubenswrapper[4957]: E0123 10:53:44.770153 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 23 10:53:45 crc kubenswrapper[4957]: I0123 10:53:45.769777 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5fxfb" Jan 23 10:53:45 crc kubenswrapper[4957]: I0123 10:53:45.769840 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 23 10:53:45 crc kubenswrapper[4957]: I0123 10:53:45.769899 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 23 10:53:45 crc kubenswrapper[4957]: E0123 10:53:45.769932 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5fxfb" podUID="87775b38-0664-48f6-8857-7568c135bd79" Jan 23 10:53:45 crc kubenswrapper[4957]: E0123 10:53:45.770048 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 23 10:53:45 crc kubenswrapper[4957]: E0123 10:53:45.770191 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 23 10:53:45 crc kubenswrapper[4957]: E0123 10:53:45.887756 4957 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Jan 23 10:53:46 crc kubenswrapper[4957]: I0123 10:53:46.769837 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 23 10:53:46 crc kubenswrapper[4957]: E0123 10:53:46.770062 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 23 10:53:47 crc kubenswrapper[4957]: I0123 10:53:47.769001 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5fxfb" Jan 23 10:53:47 crc kubenswrapper[4957]: I0123 10:53:47.769027 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 23 10:53:47 crc kubenswrapper[4957]: E0123 10:53:47.769209 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5fxfb" podUID="87775b38-0664-48f6-8857-7568c135bd79" Jan 23 10:53:47 crc kubenswrapper[4957]: I0123 10:53:47.769036 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 23 10:53:47 crc kubenswrapper[4957]: E0123 10:53:47.769420 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 23 10:53:47 crc kubenswrapper[4957]: E0123 10:53:47.769517 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 23 10:53:48 crc kubenswrapper[4957]: I0123 10:53:48.769269 4957 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 23 10:53:48 crc kubenswrapper[4957]: E0123 10:53:48.769537 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 23 10:53:49 crc kubenswrapper[4957]: I0123 10:53:49.768795 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5fxfb" Jan 23 10:53:49 crc kubenswrapper[4957]: I0123 10:53:49.768843 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 23 10:53:49 crc kubenswrapper[4957]: E0123 10:53:49.768947 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5fxfb" podUID="87775b38-0664-48f6-8857-7568c135bd79" Jan 23 10:53:49 crc kubenswrapper[4957]: E0123 10:53:49.769236 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 23 10:53:49 crc kubenswrapper[4957]: I0123 10:53:49.769418 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 23 10:53:49 crc kubenswrapper[4957]: E0123 10:53:49.769552 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 23 10:53:50 crc kubenswrapper[4957]: I0123 10:53:50.768976 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 23 10:53:50 crc kubenswrapper[4957]: E0123 10:53:50.770976 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 23 10:53:51 crc kubenswrapper[4957]: I0123 10:53:51.769420 4957 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-5fxfb" Jan 23 10:53:51 crc kubenswrapper[4957]: I0123 10:53:51.769476 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 23 10:53:51 crc kubenswrapper[4957]: I0123 10:53:51.769421 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 23 10:53:51 crc kubenswrapper[4957]: I0123 10:53:51.773237 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Jan 23 10:53:51 crc kubenswrapper[4957]: I0123 10:53:51.773237 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Jan 23 10:53:51 crc kubenswrapper[4957]: I0123 10:53:51.773346 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Jan 23 10:53:51 crc kubenswrapper[4957]: I0123 10:53:51.776698 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Jan 23 10:53:51 crc kubenswrapper[4957]: I0123 10:53:51.777106 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Jan 23 10:53:51 crc kubenswrapper[4957]: I0123 10:53:51.778520 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Jan 23 10:53:52 crc kubenswrapper[4957]: I0123 10:53:52.769314 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 23 10:53:54 crc kubenswrapper[4957]: I0123 10:53:54.943362 4957 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Jan 23 10:53:54 crc kubenswrapper[4957]: I0123 10:53:54.994546 4957 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-5t9v6"] Jan 23 10:53:54 crc kubenswrapper[4957]: I0123 10:53:54.995119 4957 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-cqcmp"] Jan 23 10:53:54 crc kubenswrapper[4957]: I0123 10:53:54.995424 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-cqcmp" Jan 23 10:53:54 crc kubenswrapper[4957]: I0123 10:53:54.995766 4957 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-5t9v6" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.002401 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.003528 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.003853 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.005181 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.005218 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.013453 4957 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-lrnc4"] Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.014935 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.015341 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.015586 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.018449 4957 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-z7qr7"] Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.018643 4957 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lt6lk"] Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.018829 4957 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-fh2f4"] Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.019106 4957 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-5vdlt"] Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.019485 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-5vdlt" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.019859 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-lrnc4" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.020063 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-z7qr7" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.020126 4957 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-d94z6"] Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.020266 4957 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lt6lk" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.020489 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fh2f4" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.023345 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-d94z6" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.025716 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.025947 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.026108 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.026167 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.026392 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.026689 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.027256 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.027483 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.028107 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.028228 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.028252 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.030048 4957 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-frdsx"] Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.030445 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-frdsx" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.054767 4957 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-wtmvm"] Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.069849 4957 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-wtmvm" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.070544 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.072504 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.072676 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.072875 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.073060 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.073198 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.073560 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.073624 4957 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7cppv"] Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.073637 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.073659 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.073746 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.073829 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.073837 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.073941 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.074044 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.074095 4957 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-d6nt4"] Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.074128 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.074163 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.074192 4957 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-console"/"console-serving-cert" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.074224 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.074133 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.074354 4957 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-jh66z"] Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.074428 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.074438 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.074466 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7cppv" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.074679 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.074786 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-jh66z" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.074976 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-d6nt4" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.075237 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.075380 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.075391 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.075416 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.075490 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.075513 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.075556 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.075578 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.075588 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.075644 4957 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.075816 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.075962 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.076071 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.076565 4957 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7t6gz"] Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.076331 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.076428 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.077193 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.077871 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.077911 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.077978 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.077985 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.080598 4957 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7t6gz" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.077998 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.078024 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.078064 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.078118 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.078145 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.078205 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.078550 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.078575 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.086127 4957 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-2z4v2"] Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.088325 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.089725 4957 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-rfzwr"] Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.090318 4957 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-2z4v2" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.091288 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f918b8f5-e2ac-439f-b9b6-8a2e85dd4ba3-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-fh2f4\" (UID: \"f918b8f5-e2ac-439f-b9b6-8a2e85dd4ba3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fh2f4" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.091323 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6bb12583-2035-41bb-8847-fd46198f7ede-serving-cert\") pod \"apiserver-76f77b778f-5t9v6\" (UID: \"6bb12583-2035-41bb-8847-fd46198f7ede\") " pod="openshift-apiserver/apiserver-76f77b778f-5t9v6" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.091344 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/6bb12583-2035-41bb-8847-fd46198f7ede-node-pullsecrets\") pod \"apiserver-76f77b778f-5t9v6\" (UID: \"6bb12583-2035-41bb-8847-fd46198f7ede\") " pod="openshift-apiserver/apiserver-76f77b778f-5t9v6" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.091366 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6bb12583-2035-41bb-8847-fd46198f7ede-config\") pod \"apiserver-76f77b778f-5t9v6\" (UID: \"6bb12583-2035-41bb-8847-fd46198f7ede\") " pod="openshift-apiserver/apiserver-76f77b778f-5t9v6" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.091379 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f918b8f5-e2ac-439f-b9b6-8a2e85dd4ba3-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-fh2f4\" (UID: \"f918b8f5-e2ac-439f-b9b6-8a2e85dd4ba3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fh2f4" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.091395 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f6bd9bee-ee36-4280-948c-f39f88acaa70-serving-cert\") pod \"route-controller-manager-6576b87f9c-z7qr7\" (UID: \"f6bd9bee-ee36-4280-948c-f39f88acaa70\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-z7qr7" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.091408 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6bb12583-2035-41bb-8847-fd46198f7ede-etcd-client\") pod \"apiserver-76f77b778f-5t9v6\" (UID: \"6bb12583-2035-41bb-8847-fd46198f7ede\") " pod="openshift-apiserver/apiserver-76f77b778f-5t9v6" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.091422 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ea6adf64-ce45-4965-8c43-0f0215c18109-client-ca\") pod \"controller-manager-879f6c89f-cqcmp\" (UID: \"ea6adf64-ce45-4965-8c43-0f0215c18109\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cqcmp" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.091438 4957 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmqwf\" (UniqueName: \"kubernetes.io/projected/4238c53f-acdc-409f-8d1d-e4608fe5c239-kube-api-access-xmqwf\") pod \"machine-api-operator-5694c8668f-lrnc4\" (UID: \"4238c53f-acdc-409f-8d1d-e4608fe5c239\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lrnc4" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.091458 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6bb12583-2035-41bb-8847-fd46198f7ede-trusted-ca-bundle\") pod \"apiserver-76f77b778f-5t9v6\" (UID: \"6bb12583-2035-41bb-8847-fd46198f7ede\") " pod="openshift-apiserver/apiserver-76f77b778f-5t9v6" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.091473 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a4d8bbc7-e221-4057-ae8c-c63d33b2e4f5-serving-cert\") pod \"openshift-config-operator-7777fb866f-5vdlt\" (UID: \"a4d8bbc7-e221-4057-ae8c-c63d33b2e4f5\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-5vdlt" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.091491 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b721a986-d921-41f6-ba96-47647e168858-service-ca\") pod \"console-f9d7485db-frdsx\" (UID: \"b721a986-d921-41f6-ba96-47647e168858\") " pod="openshift-console/console-f9d7485db-frdsx" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.091506 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/a4d8bbc7-e221-4057-ae8c-c63d33b2e4f5-available-featuregates\") pod \"openshift-config-operator-7777fb866f-5vdlt\" (UID: \"a4d8bbc7-e221-4057-ae8c-c63d33b2e4f5\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-5vdlt" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.091528 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/6bb12583-2035-41bb-8847-fd46198f7ede-etcd-serving-ca\") pod \"apiserver-76f77b778f-5t9v6\" (UID: \"6bb12583-2035-41bb-8847-fd46198f7ede\") " pod="openshift-apiserver/apiserver-76f77b778f-5t9v6" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.091549 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpvql\" (UniqueName: \"kubernetes.io/projected/f6bd9bee-ee36-4280-948c-f39f88acaa70-kube-api-access-fpvql\") pod \"route-controller-manager-6576b87f9c-z7qr7\" (UID: \"f6bd9bee-ee36-4280-948c-f39f88acaa70\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-z7qr7" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.091563 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b721a986-d921-41f6-ba96-47647e168858-console-serving-cert\") pod \"console-f9d7485db-frdsx\" (UID: \"b721a986-d921-41f6-ba96-47647e168858\") " pod="openshift-console/console-f9d7485db-frdsx" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.091580 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b721a986-d921-41f6-ba96-47647e168858-console-config\") pod \"console-f9d7485db-frdsx\" (UID: \"b721a986-d921-41f6-ba96-47647e168858\") " pod="openshift-console/console-f9d7485db-frdsx" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.091594 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/6bb12583-2035-41bb-8847-fd46198f7ede-encryption-config\") pod \"apiserver-76f77b778f-5t9v6\" (UID: \"6bb12583-2035-41bb-8847-fd46198f7ede\") " pod="openshift-apiserver/apiserver-76f77b778f-5t9v6" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.091608 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ea6adf64-ce45-4965-8c43-0f0215c18109-serving-cert\") pod \"controller-manager-879f6c89f-cqcmp\" (UID: \"ea6adf64-ce45-4965-8c43-0f0215c18109\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cqcmp" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.091626 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b721a986-d921-41f6-ba96-47647e168858-oauth-serving-cert\") pod \"console-f9d7485db-frdsx\" (UID: \"b721a986-d921-41f6-ba96-47647e168858\") " pod="openshift-console/console-f9d7485db-frdsx" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.091642 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f6bd9bee-ee36-4280-948c-f39f88acaa70-client-ca\") pod \"route-controller-manager-6576b87f9c-z7qr7\" (UID: \"f6bd9bee-ee36-4280-948c-f39f88acaa70\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-z7qr7" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.091656 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rgpg\" (UniqueName: \"kubernetes.io/projected/a4d8bbc7-e221-4057-ae8c-c63d33b2e4f5-kube-api-access-8rgpg\") pod \"openshift-config-operator-7777fb866f-5vdlt\" (UID: \"a4d8bbc7-e221-4057-ae8c-c63d33b2e4f5\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-5vdlt" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.091672 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bn2td\" (UniqueName: \"kubernetes.io/projected/f918b8f5-e2ac-439f-b9b6-8a2e85dd4ba3-kube-api-access-bn2td\") pod \"apiserver-7bbb656c7d-fh2f4\" (UID: \"f918b8f5-e2ac-439f-b9b6-8a2e85dd4ba3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fh2f4" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.091687 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lk58c\" (UniqueName: \"kubernetes.io/projected/ea6adf64-ce45-4965-8c43-0f0215c18109-kube-api-access-lk58c\") pod \"controller-manager-879f6c89f-cqcmp\" (UID: \"ea6adf64-ce45-4965-8c43-0f0215c18109\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cqcmp" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.091701 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/f918b8f5-e2ac-439f-b9b6-8a2e85dd4ba3-audit-dir\") pod \"apiserver-7bbb656c7d-fh2f4\" (UID: \"f918b8f5-e2ac-439f-b9b6-8a2e85dd4ba3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fh2f4" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.091718 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d10230a4-235f-4c7d-9057-cc613fab04fc-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-lt6lk\" (UID: \"d10230a4-235f-4c7d-9057-cc613fab04fc\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lt6lk" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.091732 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/6bb12583-2035-41bb-8847-fd46198f7ede-audit\") pod \"apiserver-76f77b778f-5t9v6\" (UID: \"6bb12583-2035-41bb-8847-fd46198f7ede\") " pod="openshift-apiserver/apiserver-76f77b778f-5t9v6" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.091745 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ea6adf64-ce45-4965-8c43-0f0215c18109-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-cqcmp\" (UID: \"ea6adf64-ce45-4965-8c43-0f0215c18109\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cqcmp" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.091761 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-df8md\" (UniqueName: \"kubernetes.io/projected/d10230a4-235f-4c7d-9057-cc613fab04fc-kube-api-access-df8md\") pod \"openshift-apiserver-operator-796bbdcf4f-lt6lk\" (UID: \"d10230a4-235f-4c7d-9057-cc613fab04fc\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lt6lk" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.091778 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f918b8f5-e2ac-439f-b9b6-8a2e85dd4ba3-encryption-config\") pod \"apiserver-7bbb656c7d-fh2f4\" (UID: \"f918b8f5-e2ac-439f-b9b6-8a2e85dd4ba3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fh2f4" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.091794 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/4238c53f-acdc-409f-8d1d-e4608fe5c239-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-lrnc4\" (UID: \"4238c53f-acdc-409f-8d1d-e4608fe5c239\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lrnc4" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.091830 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hz5f\" (UniqueName: \"kubernetes.io/projected/6bb12583-2035-41bb-8847-fd46198f7ede-kube-api-access-2hz5f\") pod \"apiserver-76f77b778f-5t9v6\" (UID: \"6bb12583-2035-41bb-8847-fd46198f7ede\") " pod="openshift-apiserver/apiserver-76f77b778f-5t9v6" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.091885 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea6adf64-ce45-4965-8c43-0f0215c18109-config\") 
pod \"controller-manager-879f6c89f-cqcmp\" (UID: \"ea6adf64-ce45-4965-8c43-0f0215c18109\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cqcmp" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.091908 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zksk2\" (UniqueName: \"kubernetes.io/projected/b721a986-d921-41f6-ba96-47647e168858-kube-api-access-zksk2\") pod \"console-f9d7485db-frdsx\" (UID: \"b721a986-d921-41f6-ba96-47647e168858\") " pod="openshift-console/console-f9d7485db-frdsx" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.091929 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f918b8f5-e2ac-439f-b9b6-8a2e85dd4ba3-serving-cert\") pod \"apiserver-7bbb656c7d-fh2f4\" (UID: \"f918b8f5-e2ac-439f-b9b6-8a2e85dd4ba3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fh2f4" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.091971 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b721a986-d921-41f6-ba96-47647e168858-trusted-ca-bundle\") pod \"console-f9d7485db-frdsx\" (UID: \"b721a986-d921-41f6-ba96-47647e168858\") " pod="openshift-console/console-f9d7485db-frdsx" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.091999 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d10230a4-235f-4c7d-9057-cc613fab04fc-config\") pod \"openshift-apiserver-operator-796bbdcf4f-lt6lk\" (UID: \"d10230a4-235f-4c7d-9057-cc613fab04fc\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lt6lk" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.092022 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f918b8f5-e2ac-439f-b9b6-8a2e85dd4ba3-etcd-client\") pod \"apiserver-7bbb656c7d-fh2f4\" (UID: \"f918b8f5-e2ac-439f-b9b6-8a2e85dd4ba3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fh2f4" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.092060 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6bd9bee-ee36-4280-948c-f39f88acaa70-config\") pod \"route-controller-manager-6576b87f9c-z7qr7\" (UID: \"f6bd9bee-ee36-4280-948c-f39f88acaa70\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-z7qr7" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.092144 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/4238c53f-acdc-409f-8d1d-e4608fe5c239-images\") pod \"machine-api-operator-5694c8668f-lrnc4\" (UID: \"4238c53f-acdc-409f-8d1d-e4608fe5c239\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lrnc4" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.092166 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6bb12583-2035-41bb-8847-fd46198f7ede-audit-dir\") pod \"apiserver-76f77b778f-5t9v6\" (UID: \"6bb12583-2035-41bb-8847-fd46198f7ede\") " pod="openshift-apiserver/apiserver-76f77b778f-5t9v6" Jan 23 10:53:55 crc 
kubenswrapper[4957]: I0123 10:53:55.092181 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4238c53f-acdc-409f-8d1d-e4608fe5c239-config\") pod \"machine-api-operator-5694c8668f-lrnc4\" (UID: \"4238c53f-acdc-409f-8d1d-e4608fe5c239\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lrnc4" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.092198 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b721a986-d921-41f6-ba96-47647e168858-console-oauth-config\") pod \"console-f9d7485db-frdsx\" (UID: \"b721a986-d921-41f6-ba96-47647e168858\") " pod="openshift-console/console-f9d7485db-frdsx" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.092218 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f918b8f5-e2ac-439f-b9b6-8a2e85dd4ba3-audit-policies\") pod \"apiserver-7bbb656c7d-fh2f4\" (UID: \"f918b8f5-e2ac-439f-b9b6-8a2e85dd4ba3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fh2f4" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.092234 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkr47\" (UniqueName: \"kubernetes.io/projected/92ace232-9107-480d-a4cd-27d7bd114efd-kube-api-access-lkr47\") pod \"downloads-7954f5f757-d94z6\" (UID: \"92ace232-9107-480d-a4cd-27d7bd114efd\") " pod="openshift-console/downloads-7954f5f757-d94z6" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.092304 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/6bb12583-2035-41bb-8847-fd46198f7ede-image-import-ca\") pod \"apiserver-76f77b778f-5t9v6\" (UID: \"6bb12583-2035-41bb-8847-fd46198f7ede\") " pod="openshift-apiserver/apiserver-76f77b778f-5t9v6" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.095562 4957 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fkz4f"] Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.095896 4957 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-wdgd4"] Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.095923 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-rfzwr" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.096042 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fkz4f" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.096579 4957 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-bcjxk"] Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.096720 4957 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-wdgd4" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.097190 4957 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h2fvd"] Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.097623 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h2fvd" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.097885 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-bcjxk" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.110537 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.111781 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.111971 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.112120 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.112252 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.112333 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.112455 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.111786 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.112651 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.112266 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.112847 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.113006 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.113725 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.113973 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 
10:53:55.114005 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.114088 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.114267 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.114476 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.114799 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.114976 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.115057 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.115207 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.115369 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.115802 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.115809 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.115992 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.116098 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.116714 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.157741 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.158106 4957 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-6brn9"] Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.158808 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.159383 4957 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-hgc64"] Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.159828 4957 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-g5hzp"] Jan 23 10:53:55 crc 
kubenswrapper[4957]: I0123 10:53:55.160470 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-g5hzp" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.160919 4957 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-grskq"] Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.161060 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.161467 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-6brn9" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.162534 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-hgc64" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.162566 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-grskq" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.162984 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.164600 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.167837 4957 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-tnjn2"] Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.168431 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.168564 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-cqcmp"] Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.168649 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tnjn2" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.168945 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.172462 4957 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-nnxst"] Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.173198 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nnxst" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.174790 4957 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-x94pd"] Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.175436 4957 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-x94pd" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.176065 4957 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-7cfk5"] Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.176844 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-7cfk5" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.177658 4957 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-8p9zs"] Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.178292 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-8p9zs" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.178624 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.180483 4957 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-2zqvt"] Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.181375 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-2zqvt" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.182212 4957 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-92gbq"] Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.182757 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-92gbq" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.184706 4957 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jp7cg"] Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.185684 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jp7cg" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.187096 4957 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7dpwm"] Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.187754 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7dpwm" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.188723 4957 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xs8l7"] Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.189353 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xs8l7" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.190172 4957 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29486085-pm58b"] Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.190880 4957 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29486085-pm58b" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.191778 4957 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-wt5d2"] Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.192409 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-wt5d2" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.192818 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c210711d-5840-4b58-948c-e9d19f041ee2-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-7cppv\" (UID: \"c210711d-5840-4b58-948c-e9d19f041ee2\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7cppv" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.192845 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d-audit-policies\") pod \"oauth-openshift-558db77b4-wtmvm\" (UID: \"dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d\") " pod="openshift-authentication/oauth-openshift-558db77b4-wtmvm" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.192872 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/4238c53f-acdc-409f-8d1d-e4608fe5c239-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-lrnc4\" (UID: \"4238c53f-acdc-409f-8d1d-e4608fe5c239\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lrnc4" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.192892 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zksk2\" (UniqueName: \"kubernetes.io/projected/b721a986-d921-41f6-ba96-47647e168858-kube-api-access-zksk2\") pod \"console-f9d7485db-frdsx\" (UID: \"b721a986-d921-41f6-ba96-47647e168858\") " pod="openshift-console/console-f9d7485db-frdsx" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.192912 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hz5f\" (UniqueName: \"kubernetes.io/projected/6bb12583-2035-41bb-8847-fd46198f7ede-kube-api-access-2hz5f\") pod \"apiserver-76f77b778f-5t9v6\" (UID: \"6bb12583-2035-41bb-8847-fd46198f7ede\") " pod="openshift-apiserver/apiserver-76f77b778f-5t9v6" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.192931 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea6adf64-ce45-4965-8c43-0f0215c18109-config\") pod \"controller-manager-879f6c89f-cqcmp\" (UID: \"ea6adf64-ce45-4965-8c43-0f0215c18109\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cqcmp" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.192948 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kxxj\" (UniqueName: \"kubernetes.io/projected/808262b6-32ca-42f1-938e-f009dac6b1db-kube-api-access-9kxxj\") pod \"dns-operator-744455d44c-rfzwr\" (UID: \"808262b6-32ca-42f1-938e-f009dac6b1db\") " pod="openshift-dns-operator/dns-operator-744455d44c-rfzwr" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.192965 
4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f918b8f5-e2ac-439f-b9b6-8a2e85dd4ba3-serving-cert\") pod \"apiserver-7bbb656c7d-fh2f4\" (UID: \"f918b8f5-e2ac-439f-b9b6-8a2e85dd4ba3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fh2f4" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.192981 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fe95d360-2a00-47e5-b577-817575b85417-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-h2fvd\" (UID: \"fe95d360-2a00-47e5-b577-817575b85417\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h2fvd" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.193001 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-wtmvm\" (UID: \"dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d\") " pod="openshift-authentication/oauth-openshift-558db77b4-wtmvm" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.193018 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-wtmvm\" (UID: \"dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d\") " pod="openshift-authentication/oauth-openshift-558db77b4-wtmvm" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.193034 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d10230a4-235f-4c7d-9057-cc613fab04fc-config\") pod \"openshift-apiserver-operator-796bbdcf4f-lt6lk\" (UID: \"d10230a4-235f-4c7d-9057-cc613fab04fc\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lt6lk" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.193052 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b721a986-d921-41f6-ba96-47647e168858-trusted-ca-bundle\") pod \"console-f9d7485db-frdsx\" (UID: \"b721a986-d921-41f6-ba96-47647e168858\") " pod="openshift-console/console-f9d7485db-frdsx" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.193085 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/214c7e14-5663-41c3-8a75-4573dff48b63-auth-proxy-config\") pod \"machine-approver-56656f9798-jh66z\" (UID: \"214c7e14-5663-41c3-8a75-4573dff48b63\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-jh66z" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.193106 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6bd9bee-ee36-4280-948c-f39f88acaa70-config\") pod \"route-controller-manager-6576b87f9c-z7qr7\" (UID: \"f6bd9bee-ee36-4280-948c-f39f88acaa70\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-z7qr7" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.193120 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" 
(UniqueName: \"kubernetes.io/secret/f918b8f5-e2ac-439f-b9b6-8a2e85dd4ba3-etcd-client\") pod \"apiserver-7bbb656c7d-fh2f4\" (UID: \"f918b8f5-e2ac-439f-b9b6-8a2e85dd4ba3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fh2f4" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.193137 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da48191c-513e-4069-b4b7-8f6de3363326-config\") pod \"etcd-operator-b45778765-2z4v2\" (UID: \"da48191c-513e-4069-b4b7-8f6de3363326\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2z4v2" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.193153 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/fe95d360-2a00-47e5-b577-817575b85417-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-h2fvd\" (UID: \"fe95d360-2a00-47e5-b577-817575b85417\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h2fvd" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.193219 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9af57495-ed2b-4a03-8206-b26948dfa61a-trusted-ca\") pod \"console-operator-58897d9998-bcjxk\" (UID: \"9af57495-ed2b-4a03-8206-b26948dfa61a\") " pod="openshift-console-operator/console-operator-58897d9998-bcjxk" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.193248 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-wtmvm\" (UID: \"dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d\") " pod="openshift-authentication/oauth-openshift-558db77b4-wtmvm" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.193265 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0f6f781-8789-47c0-badf-5f4a9dc36621-config\") pod \"authentication-operator-69f744f599-d6nt4\" (UID: \"f0f6f781-8789-47c0-badf-5f4a9dc36621\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-d6nt4" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.193302 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzvwp\" (UniqueName: \"kubernetes.io/projected/9af57495-ed2b-4a03-8206-b26948dfa61a-kube-api-access-bzvwp\") pod \"console-operator-58897d9998-bcjxk\" (UID: \"9af57495-ed2b-4a03-8206-b26948dfa61a\") " pod="openshift-console-operator/console-operator-58897d9998-bcjxk" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.193326 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5b657ce2-3516-4ac1-9bdb-a6fc97c19b31-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-fkz4f\" (UID: \"5b657ce2-3516-4ac1-9bdb-a6fc97c19b31\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fkz4f" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.193340 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6e16c2a0-8698-450c-b5d9-78ab6c26a2b3-bound-sa-token\") pod \"ingress-operator-5b745b69d9-wdgd4\" (UID: \"6e16c2a0-8698-450c-b5d9-78ab6c26a2b3\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-wdgd4" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.193354 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-wtmvm\" (UID: \"dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d\") " pod="openshift-authentication/oauth-openshift-558db77b4-wtmvm" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.193370 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhct5\" (UniqueName: \"kubernetes.io/projected/c210711d-5840-4b58-948c-e9d19f041ee2-kube-api-access-vhct5\") pod \"openshift-controller-manager-operator-756b6f6bc6-7cppv\" (UID: \"c210711d-5840-4b58-948c-e9d19f041ee2\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7cppv" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.193389 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k28ws\" (UniqueName: \"kubernetes.io/projected/7183c961-6fa6-49c8-909a-7defa10a655a-kube-api-access-k28ws\") pod \"cluster-samples-operator-665b6dd947-7t6gz\" (UID: \"7183c961-6fa6-49c8-909a-7defa10a655a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7t6gz" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.193404 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9af57495-ed2b-4a03-8206-b26948dfa61a-config\") pod \"console-operator-58897d9998-bcjxk\" (UID: \"9af57495-ed2b-4a03-8206-b26948dfa61a\") " pod="openshift-console-operator/console-operator-58897d9998-bcjxk" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.193436 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6bb12583-2035-41bb-8847-fd46198f7ede-audit-dir\") pod \"apiserver-76f77b778f-5t9v6\" (UID: \"6bb12583-2035-41bb-8847-fd46198f7ede\") " pod="openshift-apiserver/apiserver-76f77b778f-5t9v6" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.193458 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-wtmvm\" (UID: \"dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d\") " pod="openshift-authentication/oauth-openshift-558db77b4-wtmvm" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.193481 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sv67w\" (UniqueName: \"kubernetes.io/projected/f0f6f781-8789-47c0-badf-5f4a9dc36621-kube-api-access-sv67w\") pod \"authentication-operator-69f744f599-d6nt4\" (UID: \"f0f6f781-8789-47c0-badf-5f4a9dc36621\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-d6nt4" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.193502 4957 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/4238c53f-acdc-409f-8d1d-e4608fe5c239-images\") pod \"machine-api-operator-5694c8668f-lrnc4\" (UID: \"4238c53f-acdc-409f-8d1d-e4608fe5c239\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lrnc4" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.193517 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4238c53f-acdc-409f-8d1d-e4608fe5c239-config\") pod \"machine-api-operator-5694c8668f-lrnc4\" (UID: \"4238c53f-acdc-409f-8d1d-e4608fe5c239\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lrnc4" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.193531 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b721a986-d921-41f6-ba96-47647e168858-console-oauth-config\") pod \"console-f9d7485db-frdsx\" (UID: \"b721a986-d921-41f6-ba96-47647e168858\") " pod="openshift-console/console-f9d7485db-frdsx" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.193893 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f918b8f5-e2ac-439f-b9b6-8a2e85dd4ba3-audit-policies\") pod \"apiserver-7bbb656c7d-fh2f4\" (UID: \"f918b8f5-e2ac-439f-b9b6-8a2e85dd4ba3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fh2f4" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.193914 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lkr47\" (UniqueName: \"kubernetes.io/projected/92ace232-9107-480d-a4cd-27d7bd114efd-kube-api-access-lkr47\") pod \"downloads-7954f5f757-d94z6\" (UID: \"92ace232-9107-480d-a4cd-27d7bd114efd\") " pod="openshift-console/downloads-7954f5f757-d94z6" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.193933 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/6bb12583-2035-41bb-8847-fd46198f7ede-image-import-ca\") pod \"apiserver-76f77b778f-5t9v6\" (UID: \"6bb12583-2035-41bb-8847-fd46198f7ede\") " pod="openshift-apiserver/apiserver-76f77b778f-5t9v6" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.193948 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fe95d360-2a00-47e5-b577-817575b85417-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-h2fvd\" (UID: \"fe95d360-2a00-47e5-b577-817575b85417\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h2fvd" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.193971 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9af57495-ed2b-4a03-8206-b26948dfa61a-serving-cert\") pod \"console-operator-58897d9998-bcjxk\" (UID: \"9af57495-ed2b-4a03-8206-b26948dfa61a\") " pod="openshift-console-operator/console-operator-58897d9998-bcjxk" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.193986 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c210711d-5840-4b58-948c-e9d19f041ee2-serving-cert\") pod 
\"openshift-controller-manager-operator-756b6f6bc6-7cppv\" (UID: \"c210711d-5840-4b58-948c-e9d19f041ee2\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7cppv" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.194010 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d-audit-dir\") pod \"oauth-openshift-558db77b4-wtmvm\" (UID: \"dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d\") " pod="openshift-authentication/oauth-openshift-558db77b4-wtmvm" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.194031 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/808262b6-32ca-42f1-938e-f009dac6b1db-metrics-tls\") pod \"dns-operator-744455d44c-rfzwr\" (UID: \"808262b6-32ca-42f1-938e-f009dac6b1db\") " pod="openshift-dns-operator/dns-operator-744455d44c-rfzwr" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.194046 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f918b8f5-e2ac-439f-b9b6-8a2e85dd4ba3-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-fh2f4\" (UID: \"f918b8f5-e2ac-439f-b9b6-8a2e85dd4ba3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fh2f4" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.194065 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6bb12583-2035-41bb-8847-fd46198f7ede-serving-cert\") pod \"apiserver-76f77b778f-5t9v6\" (UID: \"6bb12583-2035-41bb-8847-fd46198f7ede\") " pod="openshift-apiserver/apiserver-76f77b778f-5t9v6" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.194083 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/6bb12583-2035-41bb-8847-fd46198f7ede-node-pullsecrets\") pod \"apiserver-76f77b778f-5t9v6\" (UID: \"6bb12583-2035-41bb-8847-fd46198f7ede\") " pod="openshift-apiserver/apiserver-76f77b778f-5t9v6" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.194101 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6bb12583-2035-41bb-8847-fd46198f7ede-config\") pod \"apiserver-76f77b778f-5t9v6\" (UID: \"6bb12583-2035-41bb-8847-fd46198f7ede\") " pod="openshift-apiserver/apiserver-76f77b778f-5t9v6" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.194119 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/da48191c-513e-4069-b4b7-8f6de3363326-etcd-client\") pod \"etcd-operator-b45778765-2z4v2\" (UID: \"da48191c-513e-4069-b4b7-8f6de3363326\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2z4v2" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.194141 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f0f6f781-8789-47c0-badf-5f4a9dc36621-service-ca-bundle\") pod \"authentication-operator-69f744f599-d6nt4\" (UID: \"f0f6f781-8789-47c0-badf-5f4a9dc36621\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-d6nt4" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.194167 
4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5fb7\" (UniqueName: \"kubernetes.io/projected/fe95d360-2a00-47e5-b577-817575b85417-kube-api-access-t5fb7\") pod \"cluster-image-registry-operator-dc59b4c8b-h2fvd\" (UID: \"fe95d360-2a00-47e5-b577-817575b85417\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h2fvd" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.194185 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6e16c2a0-8698-450c-b5d9-78ab6c26a2b3-metrics-tls\") pod \"ingress-operator-5b745b69d9-wdgd4\" (UID: \"6e16c2a0-8698-450c-b5d9-78ab6c26a2b3\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-wdgd4" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.194202 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f6bd9bee-ee36-4280-948c-f39f88acaa70-serving-cert\") pod \"route-controller-manager-6576b87f9c-z7qr7\" (UID: \"f6bd9bee-ee36-4280-948c-f39f88acaa70\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-z7qr7" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.194223 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f918b8f5-e2ac-439f-b9b6-8a2e85dd4ba3-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-fh2f4\" (UID: \"f918b8f5-e2ac-439f-b9b6-8a2e85dd4ba3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fh2f4" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.194243 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6e16c2a0-8698-450c-b5d9-78ab6c26a2b3-trusted-ca\") pod \"ingress-operator-5b745b69d9-wdgd4\" (UID: \"6e16c2a0-8698-450c-b5d9-78ab6c26a2b3\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-wdgd4" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.194262 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ea6adf64-ce45-4965-8c43-0f0215c18109-client-ca\") pod \"controller-manager-879f6c89f-cqcmp\" (UID: \"ea6adf64-ce45-4965-8c43-0f0215c18109\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cqcmp" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.194347 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-wtmvm\" (UID: \"dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d\") " pod="openshift-authentication/oauth-openshift-558db77b4-wtmvm" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.194365 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f0f6f781-8789-47c0-badf-5f4a9dc36621-serving-cert\") pod \"authentication-operator-69f744f599-d6nt4\" (UID: \"f0f6f781-8789-47c0-badf-5f4a9dc36621\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-d6nt4" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.194381 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"etcd-client\" (UniqueName: \"kubernetes.io/secret/6bb12583-2035-41bb-8847-fd46198f7ede-etcd-client\") pod \"apiserver-76f77b778f-5t9v6\" (UID: \"6bb12583-2035-41bb-8847-fd46198f7ede\") " pod="openshift-apiserver/apiserver-76f77b778f-5t9v6" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.194396 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/da48191c-513e-4069-b4b7-8f6de3363326-serving-cert\") pod \"etcd-operator-b45778765-2z4v2\" (UID: \"da48191c-513e-4069-b4b7-8f6de3363326\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2z4v2" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.194412 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-wtmvm\" (UID: \"dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d\") " pod="openshift-authentication/oauth-openshift-558db77b4-wtmvm" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.194429 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmqwf\" (UniqueName: \"kubernetes.io/projected/4238c53f-acdc-409f-8d1d-e4608fe5c239-kube-api-access-xmqwf\") pod \"machine-api-operator-5694c8668f-lrnc4\" (UID: \"4238c53f-acdc-409f-8d1d-e4608fe5c239\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lrnc4" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.194444 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6bb12583-2035-41bb-8847-fd46198f7ede-trusted-ca-bundle\") pod \"apiserver-76f77b778f-5t9v6\" (UID: \"6bb12583-2035-41bb-8847-fd46198f7ede\") " pod="openshift-apiserver/apiserver-76f77b778f-5t9v6" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.194459 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-wtmvm\" (UID: \"dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d\") " pod="openshift-authentication/oauth-openshift-558db77b4-wtmvm" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.194474 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a4d8bbc7-e221-4057-ae8c-c63d33b2e4f5-serving-cert\") pod \"openshift-config-operator-7777fb866f-5vdlt\" (UID: \"a4d8bbc7-e221-4057-ae8c-c63d33b2e4f5\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-5vdlt" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.194490 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b721a986-d921-41f6-ba96-47647e168858-service-ca\") pod \"console-f9d7485db-frdsx\" (UID: \"b721a986-d921-41f6-ba96-47647e168858\") " pod="openshift-console/console-f9d7485db-frdsx" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.194505 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d-v4-0-config-system-cliconfig\") pod 
\"oauth-openshift-558db77b4-wtmvm\" (UID: \"dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d\") " pod="openshift-authentication/oauth-openshift-558db77b4-wtmvm" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.195577 4957 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kvtws"] Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.195693 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4238c53f-acdc-409f-8d1d-e4608fe5c239-config\") pod \"machine-api-operator-5694c8668f-lrnc4\" (UID: \"4238c53f-acdc-409f-8d1d-e4608fe5c239\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lrnc4" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.195744 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6bb12583-2035-41bb-8847-fd46198f7ede-audit-dir\") pod \"apiserver-76f77b778f-5t9v6\" (UID: \"6bb12583-2035-41bb-8847-fd46198f7ede\") " pod="openshift-apiserver/apiserver-76f77b778f-5t9v6" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.196394 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/4238c53f-acdc-409f-8d1d-e4608fe5c239-images\") pod \"machine-api-operator-5694c8668f-lrnc4\" (UID: \"4238c53f-acdc-409f-8d1d-e4608fe5c239\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lrnc4" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.196495 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kvtws" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.196874 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f918b8f5-e2ac-439f-b9b6-8a2e85dd4ba3-audit-policies\") pod \"apiserver-7bbb656c7d-fh2f4\" (UID: \"f918b8f5-e2ac-439f-b9b6-8a2e85dd4ba3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fh2f4" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.197091 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/a4d8bbc7-e221-4057-ae8c-c63d33b2e4f5-available-featuregates\") pod \"openshift-config-operator-7777fb866f-5vdlt\" (UID: \"a4d8bbc7-e221-4057-ae8c-c63d33b2e4f5\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-5vdlt" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.197133 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/7183c961-6fa6-49c8-909a-7defa10a655a-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-7t6gz\" (UID: \"7183c961-6fa6-49c8-909a-7defa10a655a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7t6gz" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.197179 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/6bb12583-2035-41bb-8847-fd46198f7ede-etcd-serving-ca\") pod \"apiserver-76f77b778f-5t9v6\" (UID: \"6bb12583-2035-41bb-8847-fd46198f7ede\") " pod="openshift-apiserver/apiserver-76f77b778f-5t9v6" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.197208 4957 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6b2q\" (UniqueName: \"kubernetes.io/projected/da48191c-513e-4069-b4b7-8f6de3363326-kube-api-access-m6b2q\") pod \"etcd-operator-b45778765-2z4v2\" (UID: \"da48191c-513e-4069-b4b7-8f6de3363326\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2z4v2" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.197257 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fpvql\" (UniqueName: \"kubernetes.io/projected/f6bd9bee-ee36-4280-948c-f39f88acaa70-kube-api-access-fpvql\") pod \"route-controller-manager-6576b87f9c-z7qr7\" (UID: \"f6bd9bee-ee36-4280-948c-f39f88acaa70\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-z7qr7" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.197299 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b721a986-d921-41f6-ba96-47647e168858-console-serving-cert\") pod \"console-f9d7485db-frdsx\" (UID: \"b721a986-d921-41f6-ba96-47647e168858\") " pod="openshift-console/console-f9d7485db-frdsx" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.197325 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5bmn\" (UniqueName: \"kubernetes.io/projected/6e16c2a0-8698-450c-b5d9-78ab6c26a2b3-kube-api-access-t5bmn\") pod \"ingress-operator-5b745b69d9-wdgd4\" (UID: \"6e16c2a0-8698-450c-b5d9-78ab6c26a2b3\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-wdgd4" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.197353 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b721a986-d921-41f6-ba96-47647e168858-console-config\") pod \"console-f9d7485db-frdsx\" (UID: \"b721a986-d921-41f6-ba96-47647e168858\") " pod="openshift-console/console-f9d7485db-frdsx" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.197415 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-klbjf\" (UniqueName: \"kubernetes.io/projected/dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d-kube-api-access-klbjf\") pod \"oauth-openshift-558db77b4-wtmvm\" (UID: \"dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d\") " pod="openshift-authentication/oauth-openshift-558db77b4-wtmvm" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.197441 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f0f6f781-8789-47c0-badf-5f4a9dc36621-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-d6nt4\" (UID: \"f0f6f781-8789-47c0-badf-5f4a9dc36621\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-d6nt4" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.197473 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/6bb12583-2035-41bb-8847-fd46198f7ede-encryption-config\") pod \"apiserver-76f77b778f-5t9v6\" (UID: \"6bb12583-2035-41bb-8847-fd46198f7ede\") " pod="openshift-apiserver/apiserver-76f77b778f-5t9v6" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.197604 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/ea6adf64-ce45-4965-8c43-0f0215c18109-serving-cert\") pod \"controller-manager-879f6c89f-cqcmp\" (UID: \"ea6adf64-ce45-4965-8c43-0f0215c18109\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cqcmp" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.197635 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5b657ce2-3516-4ac1-9bdb-a6fc97c19b31-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-fkz4f\" (UID: \"5b657ce2-3516-4ac1-9bdb-a6fc97c19b31\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fkz4f" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.197660 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rgpg\" (UniqueName: \"kubernetes.io/projected/a4d8bbc7-e221-4057-ae8c-c63d33b2e4f5-kube-api-access-8rgpg\") pod \"openshift-config-operator-7777fb866f-5vdlt\" (UID: \"a4d8bbc7-e221-4057-ae8c-c63d33b2e4f5\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-5vdlt" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.197683 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b721a986-d921-41f6-ba96-47647e168858-oauth-serving-cert\") pod \"console-f9d7485db-frdsx\" (UID: \"b721a986-d921-41f6-ba96-47647e168858\") " pod="openshift-console/console-f9d7485db-frdsx" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.197709 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f6bd9bee-ee36-4280-948c-f39f88acaa70-client-ca\") pod \"route-controller-manager-6576b87f9c-z7qr7\" (UID: \"f6bd9bee-ee36-4280-948c-f39f88acaa70\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-z7qr7" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.197734 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lk58c\" (UniqueName: \"kubernetes.io/projected/ea6adf64-ce45-4965-8c43-0f0215c18109-kube-api-access-lk58c\") pod \"controller-manager-879f6c89f-cqcmp\" (UID: \"ea6adf64-ce45-4965-8c43-0f0215c18109\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cqcmp" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.197758 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/da48191c-513e-4069-b4b7-8f6de3363326-etcd-ca\") pod \"etcd-operator-b45778765-2z4v2\" (UID: \"da48191c-513e-4069-b4b7-8f6de3363326\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2z4v2" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.197784 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/214c7e14-5663-41c3-8a75-4573dff48b63-config\") pod \"machine-approver-56656f9798-jh66z\" (UID: \"214c7e14-5663-41c3-8a75-4573dff48b63\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-jh66z" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.197809 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-wtmvm\" (UID: \"dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d\") " pod="openshift-authentication/oauth-openshift-558db77b4-wtmvm" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.197836 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bn2td\" (UniqueName: \"kubernetes.io/projected/f918b8f5-e2ac-439f-b9b6-8a2e85dd4ba3-kube-api-access-bn2td\") pod \"apiserver-7bbb656c7d-fh2f4\" (UID: \"f918b8f5-e2ac-439f-b9b6-8a2e85dd4ba3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fh2f4" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.197858 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f918b8f5-e2ac-439f-b9b6-8a2e85dd4ba3-audit-dir\") pod \"apiserver-7bbb656c7d-fh2f4\" (UID: \"f918b8f5-e2ac-439f-b9b6-8a2e85dd4ba3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fh2f4" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.197879 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/da48191c-513e-4069-b4b7-8f6de3363326-etcd-service-ca\") pod \"etcd-operator-b45778765-2z4v2\" (UID: \"da48191c-513e-4069-b4b7-8f6de3363326\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2z4v2" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.197900 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b657ce2-3516-4ac1-9bdb-a6fc97c19b31-config\") pod \"kube-controller-manager-operator-78b949d7b-fkz4f\" (UID: \"5b657ce2-3516-4ac1-9bdb-a6fc97c19b31\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fkz4f" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.197915 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6bb12583-2035-41bb-8847-fd46198f7ede-trusted-ca-bundle\") pod \"apiserver-76f77b778f-5t9v6\" (UID: \"6bb12583-2035-41bb-8847-fd46198f7ede\") " pod="openshift-apiserver/apiserver-76f77b778f-5t9v6" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.197983 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/6bb12583-2035-41bb-8847-fd46198f7ede-image-import-ca\") pod \"apiserver-76f77b778f-5t9v6\" (UID: \"6bb12583-2035-41bb-8847-fd46198f7ede\") " pod="openshift-apiserver/apiserver-76f77b778f-5t9v6" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.198077 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/6bb12583-2035-41bb-8847-fd46198f7ede-node-pullsecrets\") pod \"apiserver-76f77b778f-5t9v6\" (UID: \"6bb12583-2035-41bb-8847-fd46198f7ede\") " pod="openshift-apiserver/apiserver-76f77b778f-5t9v6" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.198639 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.197923 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-wtmvm\" (UID: \"dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d\") " pod="openshift-authentication/oauth-openshift-558db77b4-wtmvm" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.199156 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f918b8f5-e2ac-439f-b9b6-8a2e85dd4ba3-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-fh2f4\" (UID: \"f918b8f5-e2ac-439f-b9b6-8a2e85dd4ba3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fh2f4" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.199454 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/a4d8bbc7-e221-4057-ae8c-c63d33b2e4f5-available-featuregates\") pod \"openshift-config-operator-7777fb866f-5vdlt\" (UID: \"a4d8bbc7-e221-4057-ae8c-c63d33b2e4f5\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-5vdlt" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.199734 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f918b8f5-e2ac-439f-b9b6-8a2e85dd4ba3-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-fh2f4\" (UID: \"f918b8f5-e2ac-439f-b9b6-8a2e85dd4ba3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fh2f4" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.200022 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6bb12583-2035-41bb-8847-fd46198f7ede-config\") pod \"apiserver-76f77b778f-5t9v6\" (UID: \"6bb12583-2035-41bb-8847-fd46198f7ede\") " pod="openshift-apiserver/apiserver-76f77b778f-5t9v6" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.197782 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b721a986-d921-41f6-ba96-47647e168858-service-ca\") pod \"console-f9d7485db-frdsx\" (UID: \"b721a986-d921-41f6-ba96-47647e168858\") " pod="openshift-console/console-f9d7485db-frdsx" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.200516 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b721a986-d921-41f6-ba96-47647e168858-oauth-serving-cert\") pod \"console-f9d7485db-frdsx\" (UID: \"b721a986-d921-41f6-ba96-47647e168858\") " pod="openshift-console/console-f9d7485db-frdsx" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.200570 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-5t9v6"] Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.200741 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/6bb12583-2035-41bb-8847-fd46198f7ede-etcd-serving-ca\") pod \"apiserver-76f77b778f-5t9v6\" (UID: \"6bb12583-2035-41bb-8847-fd46198f7ede\") " pod="openshift-apiserver/apiserver-76f77b778f-5t9v6" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.200812 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b721a986-d921-41f6-ba96-47647e168858-trusted-ca-bundle\") pod \"console-f9d7485db-frdsx\" (UID: \"b721a986-d921-41f6-ba96-47647e168858\") " 
pod="openshift-console/console-f9d7485db-frdsx" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.201090 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f918b8f5-e2ac-439f-b9b6-8a2e85dd4ba3-audit-dir\") pod \"apiserver-7bbb656c7d-fh2f4\" (UID: \"f918b8f5-e2ac-439f-b9b6-8a2e85dd4ba3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fh2f4" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.201361 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ea6adf64-ce45-4965-8c43-0f0215c18109-client-ca\") pod \"controller-manager-879f6c89f-cqcmp\" (UID: \"ea6adf64-ce45-4965-8c43-0f0215c18109\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cqcmp" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.202428 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/214c7e14-5663-41c3-8a75-4573dff48b63-machine-approver-tls\") pod \"machine-approver-56656f9798-jh66z\" (UID: \"214c7e14-5663-41c3-8a75-4573dff48b63\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-jh66z" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.202507 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d10230a4-235f-4c7d-9057-cc613fab04fc-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-lt6lk\" (UID: \"d10230a4-235f-4c7d-9057-cc613fab04fc\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lt6lk" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.202532 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/6bb12583-2035-41bb-8847-fd46198f7ede-audit\") pod \"apiserver-76f77b778f-5t9v6\" (UID: \"6bb12583-2035-41bb-8847-fd46198f7ede\") " pod="openshift-apiserver/apiserver-76f77b778f-5t9v6" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.202551 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f918b8f5-e2ac-439f-b9b6-8a2e85dd4ba3-encryption-config\") pod \"apiserver-7bbb656c7d-fh2f4\" (UID: \"f918b8f5-e2ac-439f-b9b6-8a2e85dd4ba3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fh2f4" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.202570 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ea6adf64-ce45-4965-8c43-0f0215c18109-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-cqcmp\" (UID: \"ea6adf64-ce45-4965-8c43-0f0215c18109\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cqcmp" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.202588 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g647f\" (UniqueName: \"kubernetes.io/projected/214c7e14-5663-41c3-8a75-4573dff48b63-kube-api-access-g647f\") pod \"machine-approver-56656f9798-jh66z\" (UID: \"214c7e14-5663-41c3-8a75-4573dff48b63\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-jh66z" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.202611 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-df8md\" 
(UniqueName: \"kubernetes.io/projected/d10230a4-235f-4c7d-9057-cc613fab04fc-kube-api-access-df8md\") pod \"openshift-apiserver-operator-796bbdcf4f-lt6lk\" (UID: \"d10230a4-235f-4c7d-9057-cc613fab04fc\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lt6lk" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.203184 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea6adf64-ce45-4965-8c43-0f0215c18109-config\") pod \"controller-manager-879f6c89f-cqcmp\" (UID: \"ea6adf64-ce45-4965-8c43-0f0215c18109\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cqcmp" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.203217 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/6bb12583-2035-41bb-8847-fd46198f7ede-audit\") pod \"apiserver-76f77b778f-5t9v6\" (UID: \"6bb12583-2035-41bb-8847-fd46198f7ede\") " pod="openshift-apiserver/apiserver-76f77b778f-5t9v6" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.204369 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b721a986-d921-41f6-ba96-47647e168858-console-config\") pod \"console-f9d7485db-frdsx\" (UID: \"b721a986-d921-41f6-ba96-47647e168858\") " pod="openshift-console/console-f9d7485db-frdsx" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.204580 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/6bb12583-2035-41bb-8847-fd46198f7ede-encryption-config\") pod \"apiserver-76f77b778f-5t9v6\" (UID: \"6bb12583-2035-41bb-8847-fd46198f7ede\") " pod="openshift-apiserver/apiserver-76f77b778f-5t9v6" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.204733 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/4238c53f-acdc-409f-8d1d-e4608fe5c239-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-lrnc4\" (UID: \"4238c53f-acdc-409f-8d1d-e4608fe5c239\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lrnc4" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.204916 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f6bd9bee-ee36-4280-948c-f39f88acaa70-client-ca\") pod \"route-controller-manager-6576b87f9c-z7qr7\" (UID: \"f6bd9bee-ee36-4280-948c-f39f88acaa70\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-z7qr7" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.204992 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d10230a4-235f-4c7d-9057-cc613fab04fc-config\") pod \"openshift-apiserver-operator-796bbdcf4f-lt6lk\" (UID: \"d10230a4-235f-4c7d-9057-cc613fab04fc\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lt6lk" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.205069 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6bd9bee-ee36-4280-948c-f39f88acaa70-config\") pod \"route-controller-manager-6576b87f9c-z7qr7\" (UID: \"f6bd9bee-ee36-4280-948c-f39f88acaa70\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-z7qr7" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 
10:53:55.206066 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b721a986-d921-41f6-ba96-47647e168858-console-oauth-config\") pod \"console-f9d7485db-frdsx\" (UID: \"b721a986-d921-41f6-ba96-47647e168858\") " pod="openshift-console/console-f9d7485db-frdsx" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.206161 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f6bd9bee-ee36-4280-948c-f39f88acaa70-serving-cert\") pod \"route-controller-manager-6576b87f9c-z7qr7\" (UID: \"f6bd9bee-ee36-4280-948c-f39f88acaa70\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-z7qr7" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.206968 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f918b8f5-e2ac-439f-b9b6-8a2e85dd4ba3-etcd-client\") pod \"apiserver-7bbb656c7d-fh2f4\" (UID: \"f918b8f5-e2ac-439f-b9b6-8a2e85dd4ba3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fh2f4" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.207011 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a4d8bbc7-e221-4057-ae8c-c63d33b2e4f5-serving-cert\") pod \"openshift-config-operator-7777fb866f-5vdlt\" (UID: \"a4d8bbc7-e221-4057-ae8c-c63d33b2e4f5\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-5vdlt" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.207082 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ea6adf64-ce45-4965-8c43-0f0215c18109-serving-cert\") pod \"controller-manager-879f6c89f-cqcmp\" (UID: \"ea6adf64-ce45-4965-8c43-0f0215c18109\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cqcmp" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.207260 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6bb12583-2035-41bb-8847-fd46198f7ede-etcd-client\") pod \"apiserver-76f77b778f-5t9v6\" (UID: \"6bb12583-2035-41bb-8847-fd46198f7ede\") " pod="openshift-apiserver/apiserver-76f77b778f-5t9v6" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.208364 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f918b8f5-e2ac-439f-b9b6-8a2e85dd4ba3-encryption-config\") pod \"apiserver-7bbb656c7d-fh2f4\" (UID: \"f918b8f5-e2ac-439f-b9b6-8a2e85dd4ba3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fh2f4" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.208411 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d10230a4-235f-4c7d-9057-cc613fab04fc-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-lt6lk\" (UID: \"d10230a4-235f-4c7d-9057-cc613fab04fc\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lt6lk" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.208575 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f918b8f5-e2ac-439f-b9b6-8a2e85dd4ba3-serving-cert\") pod \"apiserver-7bbb656c7d-fh2f4\" (UID: \"f918b8f5-e2ac-439f-b9b6-8a2e85dd4ba3\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fh2f4" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.209754 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6bb12583-2035-41bb-8847-fd46198f7ede-serving-cert\") pod \"apiserver-76f77b778f-5t9v6\" (UID: \"6bb12583-2035-41bb-8847-fd46198f7ede\") " pod="openshift-apiserver/apiserver-76f77b778f-5t9v6" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.210018 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b721a986-d921-41f6-ba96-47647e168858-console-serving-cert\") pod \"console-f9d7485db-frdsx\" (UID: \"b721a986-d921-41f6-ba96-47647e168858\") " pod="openshift-console/console-f9d7485db-frdsx" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.210177 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-lrnc4"] Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.211498 4957 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z6lpq"] Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.237151 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ea6adf64-ce45-4965-8c43-0f0215c18109-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-cqcmp\" (UID: \"ea6adf64-ce45-4965-8c43-0f0215c18109\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cqcmp" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.240471 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.243559 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.245166 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z6lpq" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.247715 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lt6lk"] Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.251078 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-d94z6"] Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.252440 4957 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-z8lhm"] Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.253551 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-wtmvm"] Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.253640 4957 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-z8lhm" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.254616 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7t6gz"] Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.255619 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-nnxst"] Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.256879 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-rfzwr"] Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.259135 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-7cfk5"] Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.260717 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.261211 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fkz4f"] Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.262866 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-d6nt4"] Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.264180 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-5vdlt"] Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.265293 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-g5hzp"] Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.267476 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-fh2f4"] Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.267502 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7cppv"] Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.268808 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-bcjxk"] Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.270694 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-hgc64"] Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.271670 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-2z4v2"] Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.272880 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-z7qr7"] Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.274060 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h2fvd"] Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.275303 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-wdgd4"] Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.276185 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29486085-pm58b"] Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.277455 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-frdsx"] Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.277564 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.278182 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-92gbq"] Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.279363 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-tnjn2"] Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.280292 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-2zqvt"] Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.281290 4957 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-n7rd5"] Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.282087 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-n7rd5" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.282746 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jp7cg"] Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.283919 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-8p9zs"] Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.285844 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7dpwm"] Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.287687 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xs8l7"] Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.288702 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-x94pd"] Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.289746 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-grskq"] Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.290893 4957 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-scpzq"] Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.291699 4957 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-scpzq" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.292195 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-wt5d2"] Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.293258 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kvtws"] Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.294559 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-z8lhm"] Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.295534 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-scpzq"] Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.296493 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z6lpq"] Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.297435 4957 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-rxncn"] Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.297635 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.298198 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-rxncn" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.298627 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-rxncn"] Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.305250 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5b657ce2-3516-4ac1-9bdb-a6fc97c19b31-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-fkz4f\" (UID: \"5b657ce2-3516-4ac1-9bdb-a6fc97c19b31\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fkz4f" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.305300 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96sbr\" (UniqueName: \"kubernetes.io/projected/410302c2-c1e9-4b31-8edb-a62078470e7f-kube-api-access-96sbr\") pod \"packageserver-d55dfcdfc-z6lpq\" (UID: \"410302c2-c1e9-4b31-8edb-a62078470e7f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z6lpq" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.305323 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/d140b4dc-6d8e-4940-9a60-aa98665ac1b2-plugins-dir\") pod \"csi-hostpathplugin-z8lhm\" (UID: \"d140b4dc-6d8e-4940-9a60-aa98665ac1b2\") " pod="hostpath-provisioner/csi-hostpathplugin-z8lhm" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.305366 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-wtmvm\" (UID: \"dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d\") " pod="openshift-authentication/oauth-openshift-558db77b4-wtmvm" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.305385 4957 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/410302c2-c1e9-4b31-8edb-a62078470e7f-tmpfs\") pod \"packageserver-d55dfcdfc-z6lpq\" (UID: \"410302c2-c1e9-4b31-8edb-a62078470e7f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z6lpq" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.305401 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c810606c-cfa3-4391-bbc2-7e6ff647393c-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-x94pd\" (UID: \"c810606c-cfa3-4391-bbc2-7e6ff647393c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-x94pd" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.305420 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/da48191c-513e-4069-b4b7-8f6de3363326-etcd-service-ca\") pod \"etcd-operator-b45778765-2z4v2\" (UID: \"da48191c-513e-4069-b4b7-8f6de3363326\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2z4v2" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.305446 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/d140b4dc-6d8e-4940-9a60-aa98665ac1b2-mountpoint-dir\") pod \"csi-hostpathplugin-z8lhm\" (UID: \"d140b4dc-6d8e-4940-9a60-aa98665ac1b2\") " pod="hostpath-provisioner/csi-hostpathplugin-z8lhm" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.305471 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g647f\" (UniqueName: \"kubernetes.io/projected/214c7e14-5663-41c3-8a75-4573dff48b63-kube-api-access-g647f\") pod \"machine-approver-56656f9798-jh66z\" (UID: \"214c7e14-5663-41c3-8a75-4573dff48b63\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-jh66z" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.305488 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/aad2f022-0b03-41aa-a16d-000844e34eed-srv-cert\") pod \"catalog-operator-68c6474976-kvtws\" (UID: \"aad2f022-0b03-41aa-a16d-000844e34eed\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kvtws" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.305502 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/d140b4dc-6d8e-4940-9a60-aa98665ac1b2-registration-dir\") pod \"csi-hostpathplugin-z8lhm\" (UID: \"d140b4dc-6d8e-4940-9a60-aa98665ac1b2\") " pod="hostpath-provisioner/csi-hostpathplugin-z8lhm" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.305525 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwsx2\" (UniqueName: \"kubernetes.io/projected/d140b4dc-6d8e-4940-9a60-aa98665ac1b2-kube-api-access-lwsx2\") pod \"csi-hostpathplugin-z8lhm\" (UID: \"d140b4dc-6d8e-4940-9a60-aa98665ac1b2\") " pod="hostpath-provisioner/csi-hostpathplugin-z8lhm" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.305544 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: 
\"kubernetes.io/secret/5b14648d-48f6-42d6-8bd3-91ddcc54bfbc-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-2zqvt\" (UID: \"5b14648d-48f6-42d6-8bd3-91ddcc54bfbc\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-2zqvt" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.305564 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jn8vq\" (UniqueName: \"kubernetes.io/projected/12c22d2e-17e6-4c9a-83f0-24225d15d476-kube-api-access-jn8vq\") pod \"machine-config-operator-74547568cd-tnjn2\" (UID: \"12c22d2e-17e6-4c9a-83f0-24225d15d476\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tnjn2" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.305578 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/410302c2-c1e9-4b31-8edb-a62078470e7f-apiservice-cert\") pod \"packageserver-d55dfcdfc-z6lpq\" (UID: \"410302c2-c1e9-4b31-8edb-a62078470e7f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z6lpq" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.305594 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fe95d360-2a00-47e5-b577-817575b85417-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-h2fvd\" (UID: \"fe95d360-2a00-47e5-b577-817575b85417\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h2fvd" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.305609 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-wtmvm\" (UID: \"dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d\") " pod="openshift-authentication/oauth-openshift-558db77b4-wtmvm" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.305625 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/7d5670ad-73cc-493e-a54f-e684a6f00f06-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-jp7cg\" (UID: \"7d5670ad-73cc-493e-a54f-e684a6f00f06\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jp7cg" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.305645 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/fe95d360-2a00-47e5-b577-817575b85417-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-h2fvd\" (UID: \"fe95d360-2a00-47e5-b577-817575b85417\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h2fvd" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.305668 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0f6f781-8789-47c0-badf-5f4a9dc36621-config\") pod \"authentication-operator-69f744f599-d6nt4\" (UID: \"f0f6f781-8789-47c0-badf-5f4a9dc36621\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-d6nt4" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.305702 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6cd38561-c9ad-4248-adfe-b62f67cf4221-config\") pod \"service-ca-operator-777779d784-wt5d2\" (UID: \"6cd38561-c9ad-4248-adfe-b62f67cf4221\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wt5d2" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.305725 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k28ws\" (UniqueName: \"kubernetes.io/projected/7183c961-6fa6-49c8-909a-7defa10a655a-kube-api-access-k28ws\") pod \"cluster-samples-operator-665b6dd947-7t6gz\" (UID: \"7183c961-6fa6-49c8-909a-7defa10a655a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7t6gz" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.305747 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zj4sm\" (UniqueName: \"kubernetes.io/projected/9f5fe703-fd89-4007-ae49-96ae0202d69c-kube-api-access-zj4sm\") pod \"kube-storage-version-migrator-operator-b67b599dd-grskq\" (UID: \"9f5fe703-fd89-4007-ae49-96ae0202d69c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-grskq" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.305773 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-wtmvm\" (UID: \"dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d\") " pod="openshift-authentication/oauth-openshift-558db77b4-wtmvm" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.305803 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/12c22d2e-17e6-4c9a-83f0-24225d15d476-proxy-tls\") pod \"machine-config-operator-74547568cd-tnjn2\" (UID: \"12c22d2e-17e6-4c9a-83f0-24225d15d476\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tnjn2" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.305827 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/beac68ed-ac4c-424e-aaff-79a4f7f246d0-stats-auth\") pod \"router-default-5444994796-6brn9\" (UID: \"beac68ed-ac4c-424e-aaff-79a4f7f246d0\") " pod="openshift-ingress/router-default-5444994796-6brn9" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.305848 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/aad2f022-0b03-41aa-a16d-000844e34eed-profile-collector-cert\") pod \"catalog-operator-68c6474976-kvtws\" (UID: \"aad2f022-0b03-41aa-a16d-000844e34eed\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kvtws" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.305872 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/808262b6-32ca-42f1-938e-f009dac6b1db-metrics-tls\") pod \"dns-operator-744455d44c-rfzwr\" (UID: \"808262b6-32ca-42f1-938e-f009dac6b1db\") " pod="openshift-dns-operator/dns-operator-744455d44c-rfzwr" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.305904 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/da48191c-513e-4069-b4b7-8f6de3363326-etcd-client\") pod \"etcd-operator-b45778765-2z4v2\" (UID: \"da48191c-513e-4069-b4b7-8f6de3363326\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2z4v2" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.305925 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f0f6f781-8789-47c0-badf-5f4a9dc36621-service-ca-bundle\") pod \"authentication-operator-69f744f599-d6nt4\" (UID: \"f0f6f781-8789-47c0-badf-5f4a9dc36621\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-d6nt4" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.305950 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rnqq\" (UniqueName: \"kubernetes.io/projected/7d5670ad-73cc-493e-a54f-e684a6f00f06-kube-api-access-9rnqq\") pod \"package-server-manager-789f6589d5-jp7cg\" (UID: \"7d5670ad-73cc-493e-a54f-e684a6f00f06\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jp7cg" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.305973 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bh5c\" (UniqueName: \"kubernetes.io/projected/5cdf7d3f-3edb-4741-a232-1f4d969417ce-kube-api-access-7bh5c\") pod \"service-ca-9c57cc56f-8p9zs\" (UID: \"5cdf7d3f-3edb-4741-a232-1f4d969417ce\") " pod="openshift-service-ca/service-ca-9c57cc56f-8p9zs" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.305992 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6e16c2a0-8698-450c-b5d9-78ab6c26a2b3-trusted-ca\") pod \"ingress-operator-5b745b69d9-wdgd4\" (UID: \"6e16c2a0-8698-450c-b5d9-78ab6c26a2b3\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-wdgd4" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.306006 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/d140b4dc-6d8e-4940-9a60-aa98665ac1b2-socket-dir\") pod \"csi-hostpathplugin-z8lhm\" (UID: \"d140b4dc-6d8e-4940-9a60-aa98665ac1b2\") " pod="hostpath-provisioner/csi-hostpathplugin-z8lhm" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.306023 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-wtmvm\" (UID: \"dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d\") " pod="openshift-authentication/oauth-openshift-558db77b4-wtmvm" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.306038 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f0f6f781-8789-47c0-badf-5f4a9dc36621-serving-cert\") pod \"authentication-operator-69f744f599-d6nt4\" (UID: \"f0f6f781-8789-47c0-badf-5f4a9dc36621\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-d6nt4" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.306053 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/da48191c-513e-4069-b4b7-8f6de3363326-serving-cert\") pod 
\"etcd-operator-b45778765-2z4v2\" (UID: \"da48191c-513e-4069-b4b7-8f6de3363326\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2z4v2" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.306074 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-wtmvm\" (UID: \"dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d\") " pod="openshift-authentication/oauth-openshift-558db77b4-wtmvm" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.306090 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/7183c961-6fa6-49c8-909a-7defa10a655a-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-7t6gz\" (UID: \"7183c961-6fa6-49c8-909a-7defa10a655a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7t6gz" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.306106 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c810606c-cfa3-4391-bbc2-7e6ff647393c-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-x94pd\" (UID: \"c810606c-cfa3-4391-bbc2-7e6ff647393c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-x94pd" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.306141 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-klbjf\" (UniqueName: \"kubernetes.io/projected/dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d-kube-api-access-klbjf\") pod \"oauth-openshift-558db77b4-wtmvm\" (UID: \"dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d\") " pod="openshift-authentication/oauth-openshift-558db77b4-wtmvm" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.306163 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9f5fe703-fd89-4007-ae49-96ae0202d69c-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-grskq\" (UID: \"9f5fe703-fd89-4007-ae49-96ae0202d69c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-grskq" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.306179 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/12c22d2e-17e6-4c9a-83f0-24225d15d476-images\") pod \"machine-config-operator-74547568cd-tnjn2\" (UID: \"12c22d2e-17e6-4c9a-83f0-24225d15d476\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tnjn2" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.306194 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c810606c-cfa3-4391-bbc2-7e6ff647393c-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-x94pd\" (UID: \"c810606c-cfa3-4391-bbc2-7e6ff647393c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-x94pd" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.306210 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czhvg\" (UniqueName: 
\"kubernetes.io/projected/5b14648d-48f6-42d6-8bd3-91ddcc54bfbc-kube-api-access-czhvg\") pod \"multus-admission-controller-857f4d67dd-2zqvt\" (UID: \"5b14648d-48f6-42d6-8bd3-91ddcc54bfbc\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-2zqvt" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.306228 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/214c7e14-5663-41c3-8a75-4573dff48b63-config\") pod \"machine-approver-56656f9798-jh66z\" (UID: \"214c7e14-5663-41c3-8a75-4573dff48b63\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-jh66z" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.306248 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/da48191c-513e-4069-b4b7-8f6de3363326-etcd-ca\") pod \"etcd-operator-b45778765-2z4v2\" (UID: \"da48191c-513e-4069-b4b7-8f6de3363326\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2z4v2" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.306265 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b657ce2-3516-4ac1-9bdb-a6fc97c19b31-config\") pod \"kube-controller-manager-operator-78b949d7b-fkz4f\" (UID: \"5b657ce2-3516-4ac1-9bdb-a6fc97c19b31\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fkz4f" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.306322 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-wtmvm\" (UID: \"dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d\") " pod="openshift-authentication/oauth-openshift-558db77b4-wtmvm" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.306342 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6tvk\" (UniqueName: \"kubernetes.io/projected/6cd38561-c9ad-4248-adfe-b62f67cf4221-kube-api-access-d6tvk\") pod \"service-ca-operator-777779d784-wt5d2\" (UID: \"6cd38561-c9ad-4248-adfe-b62f67cf4221\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wt5d2" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.306359 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/214c7e14-5663-41c3-8a75-4573dff48b63-machine-approver-tls\") pod \"machine-approver-56656f9798-jh66z\" (UID: \"214c7e14-5663-41c3-8a75-4573dff48b63\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-jh66z" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.306382 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c210711d-5840-4b58-948c-e9d19f041ee2-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-7cppv\" (UID: \"c210711d-5840-4b58-948c-e9d19f041ee2\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7cppv" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.306398 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d-audit-policies\") pod 
\"oauth-openshift-558db77b4-wtmvm\" (UID: \"dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d\") " pod="openshift-authentication/oauth-openshift-558db77b4-wtmvm" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.306418 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9kxxj\" (UniqueName: \"kubernetes.io/projected/808262b6-32ca-42f1-938e-f009dac6b1db-kube-api-access-9kxxj\") pod \"dns-operator-744455d44c-rfzwr\" (UID: \"808262b6-32ca-42f1-938e-f009dac6b1db\") " pod="openshift-dns-operator/dns-operator-744455d44c-rfzwr" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.306450 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-wtmvm\" (UID: \"dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d\") " pod="openshift-authentication/oauth-openshift-558db77b4-wtmvm" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.306487 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/214c7e14-5663-41c3-8a75-4573dff48b63-auth-proxy-config\") pod \"machine-approver-56656f9798-jh66z\" (UID: \"214c7e14-5663-41c3-8a75-4573dff48b63\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-jh66z" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.306504 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da48191c-513e-4069-b4b7-8f6de3363326-config\") pod \"etcd-operator-b45778765-2z4v2\" (UID: \"da48191c-513e-4069-b4b7-8f6de3363326\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2z4v2" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.306520 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9af57495-ed2b-4a03-8206-b26948dfa61a-trusted-ca\") pod \"console-operator-58897d9998-bcjxk\" (UID: \"9af57495-ed2b-4a03-8206-b26948dfa61a\") " pod="openshift-console-operator/console-operator-58897d9998-bcjxk" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.306536 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-wtmvm\" (UID: \"dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d\") " pod="openshift-authentication/oauth-openshift-558db77b4-wtmvm" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.306553 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzvwp\" (UniqueName: \"kubernetes.io/projected/9af57495-ed2b-4a03-8206-b26948dfa61a-kube-api-access-bzvwp\") pod \"console-operator-58897d9998-bcjxk\" (UID: \"9af57495-ed2b-4a03-8206-b26948dfa61a\") " pod="openshift-console-operator/console-operator-58897d9998-bcjxk" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.306569 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5b657ce2-3516-4ac1-9bdb-a6fc97c19b31-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-fkz4f\" (UID: \"5b657ce2-3516-4ac1-9bdb-a6fc97c19b31\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fkz4f" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.306585 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6e16c2a0-8698-450c-b5d9-78ab6c26a2b3-bound-sa-token\") pod \"ingress-operator-5b745b69d9-wdgd4\" (UID: \"6e16c2a0-8698-450c-b5d9-78ab6c26a2b3\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-wdgd4" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.306603 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-wtmvm\" (UID: \"dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d\") " pod="openshift-authentication/oauth-openshift-558db77b4-wtmvm" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.306620 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9af57495-ed2b-4a03-8206-b26948dfa61a-config\") pod \"console-operator-58897d9998-bcjxk\" (UID: \"9af57495-ed2b-4a03-8206-b26948dfa61a\") " pod="openshift-console-operator/console-operator-58897d9998-bcjxk" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.306638 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhct5\" (UniqueName: \"kubernetes.io/projected/c210711d-5840-4b58-948c-e9d19f041ee2-kube-api-access-vhct5\") pod \"openshift-controller-manager-operator-756b6f6bc6-7cppv\" (UID: \"c210711d-5840-4b58-948c-e9d19f041ee2\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7cppv" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.306656 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvkn8\" (UniqueName: \"kubernetes.io/projected/aad2f022-0b03-41aa-a16d-000844e34eed-kube-api-access-tvkn8\") pod \"catalog-operator-68c6474976-kvtws\" (UID: \"aad2f022-0b03-41aa-a16d-000844e34eed\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kvtws" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.306684 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sv67w\" (UniqueName: \"kubernetes.io/projected/f0f6f781-8789-47c0-badf-5f4a9dc36621-kube-api-access-sv67w\") pod \"authentication-operator-69f744f599-d6nt4\" (UID: \"f0f6f781-8789-47c0-badf-5f4a9dc36621\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-d6nt4" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.306709 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fe95d360-2a00-47e5-b577-817575b85417-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-h2fvd\" (UID: \"fe95d360-2a00-47e5-b577-817575b85417\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h2fvd" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.306724 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9af57495-ed2b-4a03-8206-b26948dfa61a-serving-cert\") pod \"console-operator-58897d9998-bcjxk\" (UID: \"9af57495-ed2b-4a03-8206-b26948dfa61a\") " 
pod="openshift-console-operator/console-operator-58897d9998-bcjxk" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.306758 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c210711d-5840-4b58-948c-e9d19f041ee2-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-7cppv\" (UID: \"c210711d-5840-4b58-948c-e9d19f041ee2\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7cppv" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.306774 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/5cdf7d3f-3edb-4741-a232-1f4d969417ce-signing-key\") pod \"service-ca-9c57cc56f-8p9zs\" (UID: \"5cdf7d3f-3edb-4741-a232-1f4d969417ce\") " pod="openshift-service-ca/service-ca-9c57cc56f-8p9zs" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.306789 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d-audit-dir\") pod \"oauth-openshift-558db77b4-wtmvm\" (UID: \"dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d\") " pod="openshift-authentication/oauth-openshift-558db77b4-wtmvm" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.306813 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5fb7\" (UniqueName: \"kubernetes.io/projected/fe95d360-2a00-47e5-b577-817575b85417-kube-api-access-t5fb7\") pod \"cluster-image-registry-operator-dc59b4c8b-h2fvd\" (UID: \"fe95d360-2a00-47e5-b577-817575b85417\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h2fvd" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.306827 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6e16c2a0-8698-450c-b5d9-78ab6c26a2b3-metrics-tls\") pod \"ingress-operator-5b745b69d9-wdgd4\" (UID: \"6e16c2a0-8698-450c-b5d9-78ab6c26a2b3\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-wdgd4" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.306853 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65vxk\" (UniqueName: \"kubernetes.io/projected/beac68ed-ac4c-424e-aaff-79a4f7f246d0-kube-api-access-65vxk\") pod \"router-default-5444994796-6brn9\" (UID: \"beac68ed-ac4c-424e-aaff-79a4f7f246d0\") " pod="openshift-ingress/router-default-5444994796-6brn9" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.306873 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/beac68ed-ac4c-424e-aaff-79a4f7f246d0-default-certificate\") pod \"router-default-5444994796-6brn9\" (UID: \"beac68ed-ac4c-424e-aaff-79a4f7f246d0\") " pod="openshift-ingress/router-default-5444994796-6brn9" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.306888 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/beac68ed-ac4c-424e-aaff-79a4f7f246d0-service-ca-bundle\") pod \"router-default-5444994796-6brn9\" (UID: \"beac68ed-ac4c-424e-aaff-79a4f7f246d0\") " pod="openshift-ingress/router-default-5444994796-6brn9" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.306905 
4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-wtmvm\" (UID: \"dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d\") " pod="openshift-authentication/oauth-openshift-558db77b4-wtmvm" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.306922 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-wtmvm\" (UID: \"dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d\") " pod="openshift-authentication/oauth-openshift-558db77b4-wtmvm" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.306937 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/12c22d2e-17e6-4c9a-83f0-24225d15d476-auth-proxy-config\") pod \"machine-config-operator-74547568cd-tnjn2\" (UID: \"12c22d2e-17e6-4c9a-83f0-24225d15d476\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tnjn2" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.306953 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/d140b4dc-6d8e-4940-9a60-aa98665ac1b2-csi-data-dir\") pod \"csi-hostpathplugin-z8lhm\" (UID: \"d140b4dc-6d8e-4940-9a60-aa98665ac1b2\") " pod="hostpath-provisioner/csi-hostpathplugin-z8lhm" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.306979 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6cd38561-c9ad-4248-adfe-b62f67cf4221-serving-cert\") pod \"service-ca-operator-777779d784-wt5d2\" (UID: \"6cd38561-c9ad-4248-adfe-b62f67cf4221\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wt5d2" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.307002 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/beac68ed-ac4c-424e-aaff-79a4f7f246d0-metrics-certs\") pod \"router-default-5444994796-6brn9\" (UID: \"beac68ed-ac4c-424e-aaff-79a4f7f246d0\") " pod="openshift-ingress/router-default-5444994796-6brn9" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.307018 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6b2q\" (UniqueName: \"kubernetes.io/projected/da48191c-513e-4069-b4b7-8f6de3363326-kube-api-access-m6b2q\") pod \"etcd-operator-b45778765-2z4v2\" (UID: \"da48191c-513e-4069-b4b7-8f6de3363326\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2z4v2" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.307033 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/410302c2-c1e9-4b31-8edb-a62078470e7f-webhook-cert\") pod \"packageserver-d55dfcdfc-z6lpq\" (UID: \"410302c2-c1e9-4b31-8edb-a62078470e7f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z6lpq" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.307101 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/9f5fe703-fd89-4007-ae49-96ae0202d69c-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-grskq\" (UID: \"9f5fe703-fd89-4007-ae49-96ae0202d69c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-grskq" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.307121 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5bmn\" (UniqueName: \"kubernetes.io/projected/6e16c2a0-8698-450c-b5d9-78ab6c26a2b3-kube-api-access-t5bmn\") pod \"ingress-operator-5b745b69d9-wdgd4\" (UID: \"6e16c2a0-8698-450c-b5d9-78ab6c26a2b3\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-wdgd4" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.307137 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/5cdf7d3f-3edb-4741-a232-1f4d969417ce-signing-cabundle\") pod \"service-ca-9c57cc56f-8p9zs\" (UID: \"5cdf7d3f-3edb-4741-a232-1f4d969417ce\") " pod="openshift-service-ca/service-ca-9c57cc56f-8p9zs" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.307153 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f0f6f781-8789-47c0-badf-5f4a9dc36621-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-d6nt4\" (UID: \"f0f6f781-8789-47c0-badf-5f4a9dc36621\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-d6nt4" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.308013 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f0f6f781-8789-47c0-badf-5f4a9dc36621-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-d6nt4\" (UID: \"f0f6f781-8789-47c0-badf-5f4a9dc36621\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-d6nt4" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.308886 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-wtmvm\" (UID: \"dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d\") " pod="openshift-authentication/oauth-openshift-558db77b4-wtmvm" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.309195 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/da48191c-513e-4069-b4b7-8f6de3363326-etcd-service-ca\") pod \"etcd-operator-b45778765-2z4v2\" (UID: \"da48191c-513e-4069-b4b7-8f6de3363326\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2z4v2" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.309965 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/214c7e14-5663-41c3-8a75-4573dff48b63-config\") pod \"machine-approver-56656f9798-jh66z\" (UID: \"214c7e14-5663-41c3-8a75-4573dff48b63\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-jh66z" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.310057 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da48191c-513e-4069-b4b7-8f6de3363326-config\") pod 
\"etcd-operator-b45778765-2z4v2\" (UID: \"da48191c-513e-4069-b4b7-8f6de3363326\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2z4v2" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.310114 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d-audit-dir\") pod \"oauth-openshift-558db77b4-wtmvm\" (UID: \"dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d\") " pod="openshift-authentication/oauth-openshift-558db77b4-wtmvm" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.310196 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-wtmvm\" (UID: \"dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d\") " pod="openshift-authentication/oauth-openshift-558db77b4-wtmvm" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.310507 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/da48191c-513e-4069-b4b7-8f6de3363326-etcd-ca\") pod \"etcd-operator-b45778765-2z4v2\" (UID: \"da48191c-513e-4069-b4b7-8f6de3363326\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2z4v2" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.311168 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0f6f781-8789-47c0-badf-5f4a9dc36621-config\") pod \"authentication-operator-69f744f599-d6nt4\" (UID: \"f0f6f781-8789-47c0-badf-5f4a9dc36621\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-d6nt4" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.311365 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/214c7e14-5663-41c3-8a75-4573dff48b63-auth-proxy-config\") pod \"machine-approver-56656f9798-jh66z\" (UID: \"214c7e14-5663-41c3-8a75-4573dff48b63\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-jh66z" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.311504 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d-audit-policies\") pod \"oauth-openshift-558db77b4-wtmvm\" (UID: \"dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d\") " pod="openshift-authentication/oauth-openshift-558db77b4-wtmvm" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.311577 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-wtmvm\" (UID: \"dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d\") " pod="openshift-authentication/oauth-openshift-558db77b4-wtmvm" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.311663 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f0f6f781-8789-47c0-badf-5f4a9dc36621-service-ca-bundle\") pod \"authentication-operator-69f744f599-d6nt4\" (UID: \"f0f6f781-8789-47c0-badf-5f4a9dc36621\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-d6nt4" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.311828 4957 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c210711d-5840-4b58-948c-e9d19f041ee2-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-7cppv\" (UID: \"c210711d-5840-4b58-948c-e9d19f041ee2\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7cppv" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.311831 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c210711d-5840-4b58-948c-e9d19f041ee2-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-7cppv\" (UID: \"c210711d-5840-4b58-948c-e9d19f041ee2\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7cppv" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.312188 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-wtmvm\" (UID: \"dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d\") " pod="openshift-authentication/oauth-openshift-558db77b4-wtmvm" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.312244 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b657ce2-3516-4ac1-9bdb-a6fc97c19b31-config\") pod \"kube-controller-manager-operator-78b949d7b-fkz4f\" (UID: \"5b657ce2-3516-4ac1-9bdb-a6fc97c19b31\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fkz4f" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.312493 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-wtmvm\" (UID: \"dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d\") " pod="openshift-authentication/oauth-openshift-558db77b4-wtmvm" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.313758 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-wtmvm\" (UID: \"dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d\") " pod="openshift-authentication/oauth-openshift-558db77b4-wtmvm" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.314434 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-wtmvm\" (UID: \"dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d\") " pod="openshift-authentication/oauth-openshift-558db77b4-wtmvm" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.314579 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/da48191c-513e-4069-b4b7-8f6de3363326-serving-cert\") pod \"etcd-operator-b45778765-2z4v2\" (UID: \"da48191c-513e-4069-b4b7-8f6de3363326\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2z4v2" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.314669 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/f0f6f781-8789-47c0-badf-5f4a9dc36621-serving-cert\") pod \"authentication-operator-69f744f599-d6nt4\" (UID: \"f0f6f781-8789-47c0-badf-5f4a9dc36621\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-d6nt4" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.314765 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/7183c961-6fa6-49c8-909a-7defa10a655a-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-7t6gz\" (UID: \"7183c961-6fa6-49c8-909a-7defa10a655a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7t6gz" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.315161 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/da48191c-513e-4069-b4b7-8f6de3363326-etcd-client\") pod \"etcd-operator-b45778765-2z4v2\" (UID: \"da48191c-513e-4069-b4b7-8f6de3363326\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2z4v2" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.315404 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-wtmvm\" (UID: \"dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d\") " pod="openshift-authentication/oauth-openshift-558db77b4-wtmvm" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.315617 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-wtmvm\" (UID: \"dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d\") " pod="openshift-authentication/oauth-openshift-558db77b4-wtmvm" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.315829 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/214c7e14-5663-41c3-8a75-4573dff48b63-machine-approver-tls\") pod \"machine-approver-56656f9798-jh66z\" (UID: \"214c7e14-5663-41c3-8a75-4573dff48b63\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-jh66z" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.316050 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5b657ce2-3516-4ac1-9bdb-a6fc97c19b31-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-fkz4f\" (UID: \"5b657ce2-3516-4ac1-9bdb-a6fc97c19b31\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fkz4f" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.316167 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-wtmvm\" (UID: \"dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d\") " pod="openshift-authentication/oauth-openshift-558db77b4-wtmvm" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.316197 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-wtmvm\" (UID: \"dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d\") " pod="openshift-authentication/oauth-openshift-558db77b4-wtmvm" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.317397 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.322854 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/808262b6-32ca-42f1-938e-f009dac6b1db-metrics-tls\") pod \"dns-operator-744455d44c-rfzwr\" (UID: \"808262b6-32ca-42f1-938e-f009dac6b1db\") " pod="openshift-dns-operator/dns-operator-744455d44c-rfzwr" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.337804 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.342045 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6e16c2a0-8698-450c-b5d9-78ab6c26a2b3-metrics-tls\") pod \"ingress-operator-5b745b69d9-wdgd4\" (UID: \"6e16c2a0-8698-450c-b5d9-78ab6c26a2b3\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-wdgd4" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.362897 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.374188 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6e16c2a0-8698-450c-b5d9-78ab6c26a2b3-trusted-ca\") pod \"ingress-operator-5b745b69d9-wdgd4\" (UID: \"6e16c2a0-8698-450c-b5d9-78ab6c26a2b3\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-wdgd4" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.378155 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.404169 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.407871 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65vxk\" (UniqueName: \"kubernetes.io/projected/beac68ed-ac4c-424e-aaff-79a4f7f246d0-kube-api-access-65vxk\") pod \"router-default-5444994796-6brn9\" (UID: \"beac68ed-ac4c-424e-aaff-79a4f7f246d0\") " pod="openshift-ingress/router-default-5444994796-6brn9" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.407973 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/beac68ed-ac4c-424e-aaff-79a4f7f246d0-default-certificate\") pod \"router-default-5444994796-6brn9\" (UID: \"beac68ed-ac4c-424e-aaff-79a4f7f246d0\") " pod="openshift-ingress/router-default-5444994796-6brn9" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.408053 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/beac68ed-ac4c-424e-aaff-79a4f7f246d0-service-ca-bundle\") pod \"router-default-5444994796-6brn9\" (UID: \"beac68ed-ac4c-424e-aaff-79a4f7f246d0\") " 
pod="openshift-ingress/router-default-5444994796-6brn9" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.408146 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/12c22d2e-17e6-4c9a-83f0-24225d15d476-auth-proxy-config\") pod \"machine-config-operator-74547568cd-tnjn2\" (UID: \"12c22d2e-17e6-4c9a-83f0-24225d15d476\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tnjn2" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.408234 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/d140b4dc-6d8e-4940-9a60-aa98665ac1b2-csi-data-dir\") pod \"csi-hostpathplugin-z8lhm\" (UID: \"d140b4dc-6d8e-4940-9a60-aa98665ac1b2\") " pod="hostpath-provisioner/csi-hostpathplugin-z8lhm" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.408340 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/beac68ed-ac4c-424e-aaff-79a4f7f246d0-metrics-certs\") pod \"router-default-5444994796-6brn9\" (UID: \"beac68ed-ac4c-424e-aaff-79a4f7f246d0\") " pod="openshift-ingress/router-default-5444994796-6brn9" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.408452 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/d140b4dc-6d8e-4940-9a60-aa98665ac1b2-csi-data-dir\") pod \"csi-hostpathplugin-z8lhm\" (UID: \"d140b4dc-6d8e-4940-9a60-aa98665ac1b2\") " pod="hostpath-provisioner/csi-hostpathplugin-z8lhm" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.408478 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6cd38561-c9ad-4248-adfe-b62f67cf4221-serving-cert\") pod \"service-ca-operator-777779d784-wt5d2\" (UID: \"6cd38561-c9ad-4248-adfe-b62f67cf4221\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wt5d2" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.408570 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/410302c2-c1e9-4b31-8edb-a62078470e7f-webhook-cert\") pod \"packageserver-d55dfcdfc-z6lpq\" (UID: \"410302c2-c1e9-4b31-8edb-a62078470e7f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z6lpq" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.408616 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f5fe703-fd89-4007-ae49-96ae0202d69c-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-grskq\" (UID: \"9f5fe703-fd89-4007-ae49-96ae0202d69c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-grskq" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.408652 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/5cdf7d3f-3edb-4741-a232-1f4d969417ce-signing-cabundle\") pod \"service-ca-9c57cc56f-8p9zs\" (UID: \"5cdf7d3f-3edb-4741-a232-1f4d969417ce\") " pod="openshift-service-ca/service-ca-9c57cc56f-8p9zs" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.408689 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96sbr\" (UniqueName: 
\"kubernetes.io/projected/410302c2-c1e9-4b31-8edb-a62078470e7f-kube-api-access-96sbr\") pod \"packageserver-d55dfcdfc-z6lpq\" (UID: \"410302c2-c1e9-4b31-8edb-a62078470e7f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z6lpq" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.408722 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/d140b4dc-6d8e-4940-9a60-aa98665ac1b2-plugins-dir\") pod \"csi-hostpathplugin-z8lhm\" (UID: \"d140b4dc-6d8e-4940-9a60-aa98665ac1b2\") " pod="hostpath-provisioner/csi-hostpathplugin-z8lhm" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.408755 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/410302c2-c1e9-4b31-8edb-a62078470e7f-tmpfs\") pod \"packageserver-d55dfcdfc-z6lpq\" (UID: \"410302c2-c1e9-4b31-8edb-a62078470e7f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z6lpq" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.408781 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c810606c-cfa3-4391-bbc2-7e6ff647393c-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-x94pd\" (UID: \"c810606c-cfa3-4391-bbc2-7e6ff647393c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-x94pd" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.408807 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/d140b4dc-6d8e-4940-9a60-aa98665ac1b2-mountpoint-dir\") pod \"csi-hostpathplugin-z8lhm\" (UID: \"d140b4dc-6d8e-4940-9a60-aa98665ac1b2\") " pod="hostpath-provisioner/csi-hostpathplugin-z8lhm" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.408842 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/aad2f022-0b03-41aa-a16d-000844e34eed-srv-cert\") pod \"catalog-operator-68c6474976-kvtws\" (UID: \"aad2f022-0b03-41aa-a16d-000844e34eed\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kvtws" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.408866 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/d140b4dc-6d8e-4940-9a60-aa98665ac1b2-registration-dir\") pod \"csi-hostpathplugin-z8lhm\" (UID: \"d140b4dc-6d8e-4940-9a60-aa98665ac1b2\") " pod="hostpath-provisioner/csi-hostpathplugin-z8lhm" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.408891 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5b14648d-48f6-42d6-8bd3-91ddcc54bfbc-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-2zqvt\" (UID: \"5b14648d-48f6-42d6-8bd3-91ddcc54bfbc\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-2zqvt" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.408914 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwsx2\" (UniqueName: \"kubernetes.io/projected/d140b4dc-6d8e-4940-9a60-aa98665ac1b2-kube-api-access-lwsx2\") pod \"csi-hostpathplugin-z8lhm\" (UID: \"d140b4dc-6d8e-4940-9a60-aa98665ac1b2\") " pod="hostpath-provisioner/csi-hostpathplugin-z8lhm" Jan 23 10:53:55 crc kubenswrapper[4957]: 
I0123 10:53:55.408915 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/d140b4dc-6d8e-4940-9a60-aa98665ac1b2-mountpoint-dir\") pod \"csi-hostpathplugin-z8lhm\" (UID: \"d140b4dc-6d8e-4940-9a60-aa98665ac1b2\") " pod="hostpath-provisioner/csi-hostpathplugin-z8lhm" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.408942 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jn8vq\" (UniqueName: \"kubernetes.io/projected/12c22d2e-17e6-4c9a-83f0-24225d15d476-kube-api-access-jn8vq\") pod \"machine-config-operator-74547568cd-tnjn2\" (UID: \"12c22d2e-17e6-4c9a-83f0-24225d15d476\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tnjn2" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.409035 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/d140b4dc-6d8e-4940-9a60-aa98665ac1b2-plugins-dir\") pod \"csi-hostpathplugin-z8lhm\" (UID: \"d140b4dc-6d8e-4940-9a60-aa98665ac1b2\") " pod="hostpath-provisioner/csi-hostpathplugin-z8lhm" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.409027 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/12c22d2e-17e6-4c9a-83f0-24225d15d476-auth-proxy-config\") pod \"machine-config-operator-74547568cd-tnjn2\" (UID: \"12c22d2e-17e6-4c9a-83f0-24225d15d476\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tnjn2" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.409077 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/410302c2-c1e9-4b31-8edb-a62078470e7f-apiservice-cert\") pod \"packageserver-d55dfcdfc-z6lpq\" (UID: \"410302c2-c1e9-4b31-8edb-a62078470e7f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z6lpq" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.409116 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/7d5670ad-73cc-493e-a54f-e684a6f00f06-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-jp7cg\" (UID: \"7d5670ad-73cc-493e-a54f-e684a6f00f06\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jp7cg" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.409155 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/d140b4dc-6d8e-4940-9a60-aa98665ac1b2-registration-dir\") pod \"csi-hostpathplugin-z8lhm\" (UID: \"d140b4dc-6d8e-4940-9a60-aa98665ac1b2\") " pod="hostpath-provisioner/csi-hostpathplugin-z8lhm" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.409158 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6cd38561-c9ad-4248-adfe-b62f67cf4221-config\") pod \"service-ca-operator-777779d784-wt5d2\" (UID: \"6cd38561-c9ad-4248-adfe-b62f67cf4221\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wt5d2" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.409214 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zj4sm\" (UniqueName: 
\"kubernetes.io/projected/9f5fe703-fd89-4007-ae49-96ae0202d69c-kube-api-access-zj4sm\") pod \"kube-storage-version-migrator-operator-b67b599dd-grskq\" (UID: \"9f5fe703-fd89-4007-ae49-96ae0202d69c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-grskq" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.409254 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/12c22d2e-17e6-4c9a-83f0-24225d15d476-proxy-tls\") pod \"machine-config-operator-74547568cd-tnjn2\" (UID: \"12c22d2e-17e6-4c9a-83f0-24225d15d476\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tnjn2" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.409300 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/beac68ed-ac4c-424e-aaff-79a4f7f246d0-stats-auth\") pod \"router-default-5444994796-6brn9\" (UID: \"beac68ed-ac4c-424e-aaff-79a4f7f246d0\") " pod="openshift-ingress/router-default-5444994796-6brn9" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.409325 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/aad2f022-0b03-41aa-a16d-000844e34eed-profile-collector-cert\") pod \"catalog-operator-68c6474976-kvtws\" (UID: \"aad2f022-0b03-41aa-a16d-000844e34eed\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kvtws" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.409354 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rnqq\" (UniqueName: \"kubernetes.io/projected/7d5670ad-73cc-493e-a54f-e684a6f00f06-kube-api-access-9rnqq\") pod \"package-server-manager-789f6589d5-jp7cg\" (UID: \"7d5670ad-73cc-493e-a54f-e684a6f00f06\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jp7cg" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.409382 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7bh5c\" (UniqueName: \"kubernetes.io/projected/5cdf7d3f-3edb-4741-a232-1f4d969417ce-kube-api-access-7bh5c\") pod \"service-ca-9c57cc56f-8p9zs\" (UID: \"5cdf7d3f-3edb-4741-a232-1f4d969417ce\") " pod="openshift-service-ca/service-ca-9c57cc56f-8p9zs" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.409409 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/d140b4dc-6d8e-4940-9a60-aa98665ac1b2-socket-dir\") pod \"csi-hostpathplugin-z8lhm\" (UID: \"d140b4dc-6d8e-4940-9a60-aa98665ac1b2\") " pod="hostpath-provisioner/csi-hostpathplugin-z8lhm" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.409473 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c810606c-cfa3-4391-bbc2-7e6ff647393c-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-x94pd\" (UID: \"c810606c-cfa3-4391-bbc2-7e6ff647393c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-x94pd" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.409510 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c810606c-cfa3-4391-bbc2-7e6ff647393c-kube-api-access\") pod 
\"openshift-kube-scheduler-operator-5fdd9b5758-x94pd\" (UID: \"c810606c-cfa3-4391-bbc2-7e6ff647393c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-x94pd" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.409537 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9f5fe703-fd89-4007-ae49-96ae0202d69c-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-grskq\" (UID: \"9f5fe703-fd89-4007-ae49-96ae0202d69c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-grskq" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.409550 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/410302c2-c1e9-4b31-8edb-a62078470e7f-tmpfs\") pod \"packageserver-d55dfcdfc-z6lpq\" (UID: \"410302c2-c1e9-4b31-8edb-a62078470e7f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z6lpq" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.409587 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/d140b4dc-6d8e-4940-9a60-aa98665ac1b2-socket-dir\") pod \"csi-hostpathplugin-z8lhm\" (UID: \"d140b4dc-6d8e-4940-9a60-aa98665ac1b2\") " pod="hostpath-provisioner/csi-hostpathplugin-z8lhm" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.409563 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/12c22d2e-17e6-4c9a-83f0-24225d15d476-images\") pod \"machine-config-operator-74547568cd-tnjn2\" (UID: \"12c22d2e-17e6-4c9a-83f0-24225d15d476\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tnjn2" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.409669 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czhvg\" (UniqueName: \"kubernetes.io/projected/5b14648d-48f6-42d6-8bd3-91ddcc54bfbc-kube-api-access-czhvg\") pod \"multus-admission-controller-857f4d67dd-2zqvt\" (UID: \"5b14648d-48f6-42d6-8bd3-91ddcc54bfbc\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-2zqvt" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.409704 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6tvk\" (UniqueName: \"kubernetes.io/projected/6cd38561-c9ad-4248-adfe-b62f67cf4221-kube-api-access-d6tvk\") pod \"service-ca-operator-777779d784-wt5d2\" (UID: \"6cd38561-c9ad-4248-adfe-b62f67cf4221\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wt5d2" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.409811 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvkn8\" (UniqueName: \"kubernetes.io/projected/aad2f022-0b03-41aa-a16d-000844e34eed-kube-api-access-tvkn8\") pod \"catalog-operator-68c6474976-kvtws\" (UID: \"aad2f022-0b03-41aa-a16d-000844e34eed\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kvtws" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.409855 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/5cdf7d3f-3edb-4741-a232-1f4d969417ce-signing-key\") pod \"service-ca-9c57cc56f-8p9zs\" (UID: \"5cdf7d3f-3edb-4741-a232-1f4d969417ce\") " 
pod="openshift-service-ca/service-ca-9c57cc56f-8p9zs" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.412354 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fe95d360-2a00-47e5-b577-817575b85417-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-h2fvd\" (UID: \"fe95d360-2a00-47e5-b577-817575b85417\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h2fvd" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.417623 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.422144 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9af57495-ed2b-4a03-8206-b26948dfa61a-config\") pod \"console-operator-58897d9998-bcjxk\" (UID: \"9af57495-ed2b-4a03-8206-b26948dfa61a\") " pod="openshift-console-operator/console-operator-58897d9998-bcjxk" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.437623 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.458144 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.464383 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/fe95d360-2a00-47e5-b577-817575b85417-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-h2fvd\" (UID: \"fe95d360-2a00-47e5-b577-817575b85417\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h2fvd" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.479240 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.497861 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.502448 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9af57495-ed2b-4a03-8206-b26948dfa61a-serving-cert\") pod \"console-operator-58897d9998-bcjxk\" (UID: \"9af57495-ed2b-4a03-8206-b26948dfa61a\") " pod="openshift-console-operator/console-operator-58897d9998-bcjxk" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.522834 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.530010 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9af57495-ed2b-4a03-8206-b26948dfa61a-trusted-ca\") pod \"console-operator-58897d9998-bcjxk\" (UID: \"9af57495-ed2b-4a03-8206-b26948dfa61a\") " pod="openshift-console-operator/console-operator-58897d9998-bcjxk" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.537949 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.557121 4957 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.598064 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.617820 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.637929 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.657960 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.677873 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.697595 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.703504 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/beac68ed-ac4c-424e-aaff-79a4f7f246d0-default-certificate\") pod \"router-default-5444994796-6brn9\" (UID: \"beac68ed-ac4c-424e-aaff-79a4f7f246d0\") " pod="openshift-ingress/router-default-5444994796-6brn9" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.718361 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.738262 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.744207 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/beac68ed-ac4c-424e-aaff-79a4f7f246d0-stats-auth\") pod \"router-default-5444994796-6brn9\" (UID: \"beac68ed-ac4c-424e-aaff-79a4f7f246d0\") " pod="openshift-ingress/router-default-5444994796-6brn9" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.757232 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.779604 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.798237 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.818258 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.837827 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.839969 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f5fe703-fd89-4007-ae49-96ae0202d69c-config\") pod 
\"kube-storage-version-migrator-operator-b67b599dd-grskq\" (UID: \"9f5fe703-fd89-4007-ae49-96ae0202d69c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-grskq" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.859248 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.864246 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9f5fe703-fd89-4007-ae49-96ae0202d69c-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-grskq\" (UID: \"9f5fe703-fd89-4007-ae49-96ae0202d69c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-grskq" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.878853 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.898614 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.903885 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/beac68ed-ac4c-424e-aaff-79a4f7f246d0-metrics-certs\") pod \"router-default-5444994796-6brn9\" (UID: \"beac68ed-ac4c-424e-aaff-79a4f7f246d0\") " pod="openshift-ingress/router-default-5444994796-6brn9" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.918887 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.938549 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.958053 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.959506 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/beac68ed-ac4c-424e-aaff-79a4f7f246d0-service-ca-bundle\") pod \"router-default-5444994796-6brn9\" (UID: \"beac68ed-ac4c-424e-aaff-79a4f7f246d0\") " pod="openshift-ingress/router-default-5444994796-6brn9" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.977817 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.981227 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/12c22d2e-17e6-4c9a-83f0-24225d15d476-images\") pod \"machine-config-operator-74547568cd-tnjn2\" (UID: \"12c22d2e-17e6-4c9a-83f0-24225d15d476\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tnjn2" Jan 23 10:53:55 crc kubenswrapper[4957]: I0123 10:53:55.999317 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Jan 23 10:53:56 crc kubenswrapper[4957]: I0123 10:53:56.002642 4957 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/12c22d2e-17e6-4c9a-83f0-24225d15d476-proxy-tls\") pod \"machine-config-operator-74547568cd-tnjn2\" (UID: \"12c22d2e-17e6-4c9a-83f0-24225d15d476\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tnjn2" Jan 23 10:53:56 crc kubenswrapper[4957]: I0123 10:53:56.019755 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Jan 23 10:53:56 crc kubenswrapper[4957]: I0123 10:53:56.038172 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Jan 23 10:53:56 crc kubenswrapper[4957]: I0123 10:53:56.058706 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Jan 23 10:53:56 crc kubenswrapper[4957]: I0123 10:53:56.079188 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Jan 23 10:53:56 crc kubenswrapper[4957]: I0123 10:53:56.098738 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Jan 23 10:53:56 crc kubenswrapper[4957]: I0123 10:53:56.118951 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Jan 23 10:53:56 crc kubenswrapper[4957]: I0123 10:53:56.138557 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Jan 23 10:53:56 crc kubenswrapper[4957]: I0123 10:53:56.144203 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c810606c-cfa3-4391-bbc2-7e6ff647393c-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-x94pd\" (UID: \"c810606c-cfa3-4391-bbc2-7e6ff647393c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-x94pd" Jan 23 10:53:56 crc kubenswrapper[4957]: I0123 10:53:56.158841 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Jan 23 10:53:56 crc kubenswrapper[4957]: I0123 10:53:56.161382 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c810606c-cfa3-4391-bbc2-7e6ff647393c-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-x94pd\" (UID: \"c810606c-cfa3-4391-bbc2-7e6ff647393c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-x94pd" Jan 23 10:53:56 crc kubenswrapper[4957]: I0123 10:53:56.179107 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Jan 23 10:53:56 crc kubenswrapper[4957]: I0123 10:53:56.196123 4957 request.go:700] Waited for 1.018975449s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&limit=500&resourceVersion=0 Jan 23 10:53:56 crc kubenswrapper[4957]: I0123 10:53:56.197821 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" 
Jan 23 10:53:56 crc kubenswrapper[4957]: I0123 10:53:56.226986 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Jan 23 10:53:56 crc kubenswrapper[4957]: I0123 10:53:56.238256 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Jan 23 10:53:56 crc kubenswrapper[4957]: I0123 10:53:56.258272 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Jan 23 10:53:56 crc kubenswrapper[4957]: I0123 10:53:56.277876 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Jan 23 10:53:56 crc kubenswrapper[4957]: I0123 10:53:56.298447 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Jan 23 10:53:56 crc kubenswrapper[4957]: I0123 10:53:56.318225 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Jan 23 10:53:56 crc kubenswrapper[4957]: I0123 10:53:56.320927 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/5cdf7d3f-3edb-4741-a232-1f4d969417ce-signing-cabundle\") pod \"service-ca-9c57cc56f-8p9zs\" (UID: \"5cdf7d3f-3edb-4741-a232-1f4d969417ce\") " pod="openshift-service-ca/service-ca-9c57cc56f-8p9zs" Jan 23 10:53:56 crc kubenswrapper[4957]: I0123 10:53:56.337477 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Jan 23 10:53:56 crc kubenswrapper[4957]: I0123 10:53:56.345136 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/5cdf7d3f-3edb-4741-a232-1f4d969417ce-signing-key\") pod \"service-ca-9c57cc56f-8p9zs\" (UID: \"5cdf7d3f-3edb-4741-a232-1f4d969417ce\") " pod="openshift-service-ca/service-ca-9c57cc56f-8p9zs" Jan 23 10:53:56 crc kubenswrapper[4957]: I0123 10:53:56.358723 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Jan 23 10:53:56 crc kubenswrapper[4957]: I0123 10:53:56.378154 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Jan 23 10:53:56 crc kubenswrapper[4957]: I0123 10:53:56.398119 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Jan 23 10:53:56 crc kubenswrapper[4957]: I0123 10:53:56.405610 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5b14648d-48f6-42d6-8bd3-91ddcc54bfbc-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-2zqvt\" (UID: \"5b14648d-48f6-42d6-8bd3-91ddcc54bfbc\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-2zqvt" Jan 23 10:53:56 crc kubenswrapper[4957]: E0123 10:53:56.409767 4957 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/pprof-cert: failed to sync secret cache: timed out waiting for the condition Jan 23 10:53:56 crc kubenswrapper[4957]: E0123 10:53:56.409818 4957 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/packageserver-service-cert: failed to sync secret cache: timed out waiting for the condition Jan 23 10:53:56 crc kubenswrapper[4957]: E0123 10:53:56.409847 4957 secret.go:188] Couldn't get secret 
openshift-operator-lifecycle-manager/catalog-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition Jan 23 10:53:56 crc kubenswrapper[4957]: E0123 10:53:56.409820 4957 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/packageserver-service-cert: failed to sync secret cache: timed out waiting for the condition Jan 23 10:53:56 crc kubenswrapper[4957]: E0123 10:53:56.409901 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aad2f022-0b03-41aa-a16d-000844e34eed-profile-collector-cert podName:aad2f022-0b03-41aa-a16d-000844e34eed nodeName:}" failed. No retries permitted until 2026-01-23 10:53:56.90986313 +0000 UTC m=+146.447115867 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "profile-collector-cert" (UniqueName: "kubernetes.io/secret/aad2f022-0b03-41aa-a16d-000844e34eed-profile-collector-cert") pod "catalog-operator-68c6474976-kvtws" (UID: "aad2f022-0b03-41aa-a16d-000844e34eed") : failed to sync secret cache: timed out waiting for the condition Jan 23 10:53:56 crc kubenswrapper[4957]: E0123 10:53:56.409905 4957 configmap.go:193] Couldn't get configMap openshift-service-ca-operator/service-ca-operator-config: failed to sync configmap cache: timed out waiting for the condition Jan 23 10:53:56 crc kubenswrapper[4957]: E0123 10:53:56.409948 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/410302c2-c1e9-4b31-8edb-a62078470e7f-webhook-cert podName:410302c2-c1e9-4b31-8edb-a62078470e7f nodeName:}" failed. No retries permitted until 2026-01-23 10:53:56.909927562 +0000 UTC m=+146.447180299 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-cert" (UniqueName: "kubernetes.io/secret/410302c2-c1e9-4b31-8edb-a62078470e7f-webhook-cert") pod "packageserver-d55dfcdfc-z6lpq" (UID: "410302c2-c1e9-4b31-8edb-a62078470e7f") : failed to sync secret cache: timed out waiting for the condition Jan 23 10:53:56 crc kubenswrapper[4957]: E0123 10:53:56.409967 4957 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: failed to sync secret cache: timed out waiting for the condition Jan 23 10:53:56 crc kubenswrapper[4957]: E0123 10:53:56.409934 4957 secret.go:188] Couldn't get secret openshift-service-ca-operator/serving-cert: failed to sync secret cache: timed out waiting for the condition Jan 23 10:53:56 crc kubenswrapper[4957]: E0123 10:53:56.409984 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aad2f022-0b03-41aa-a16d-000844e34eed-srv-cert podName:aad2f022-0b03-41aa-a16d-000844e34eed nodeName:}" failed. No retries permitted until 2026-01-23 10:53:56.909966113 +0000 UTC m=+146.447218850 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/aad2f022-0b03-41aa-a16d-000844e34eed-srv-cert") pod "catalog-operator-68c6474976-kvtws" (UID: "aad2f022-0b03-41aa-a16d-000844e34eed") : failed to sync secret cache: timed out waiting for the condition Jan 23 10:53:56 crc kubenswrapper[4957]: E0123 10:53:56.410123 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/410302c2-c1e9-4b31-8edb-a62078470e7f-apiservice-cert podName:410302c2-c1e9-4b31-8edb-a62078470e7f nodeName:}" failed. No retries permitted until 2026-01-23 10:53:56.910085326 +0000 UTC m=+146.447338093 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/410302c2-c1e9-4b31-8edb-a62078470e7f-apiservice-cert") pod "packageserver-d55dfcdfc-z6lpq" (UID: "410302c2-c1e9-4b31-8edb-a62078470e7f") : failed to sync secret cache: timed out waiting for the condition Jan 23 10:53:56 crc kubenswrapper[4957]: E0123 10:53:56.410176 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6cd38561-c9ad-4248-adfe-b62f67cf4221-config podName:6cd38561-c9ad-4248-adfe-b62f67cf4221 nodeName:}" failed. No retries permitted until 2026-01-23 10:53:56.910157108 +0000 UTC m=+146.447409965 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/6cd38561-c9ad-4248-adfe-b62f67cf4221-config") pod "service-ca-operator-777779d784-wt5d2" (UID: "6cd38561-c9ad-4248-adfe-b62f67cf4221") : failed to sync configmap cache: timed out waiting for the condition Jan 23 10:53:56 crc kubenswrapper[4957]: E0123 10:53:56.410216 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6cd38561-c9ad-4248-adfe-b62f67cf4221-serving-cert podName:6cd38561-c9ad-4248-adfe-b62f67cf4221 nodeName:}" failed. No retries permitted until 2026-01-23 10:53:56.910201439 +0000 UTC m=+146.447454266 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/6cd38561-c9ad-4248-adfe-b62f67cf4221-serving-cert") pod "service-ca-operator-777779d784-wt5d2" (UID: "6cd38561-c9ad-4248-adfe-b62f67cf4221") : failed to sync secret cache: timed out waiting for the condition Jan 23 10:53:56 crc kubenswrapper[4957]: E0123 10:53:56.410264 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d5670ad-73cc-493e-a54f-e684a6f00f06-package-server-manager-serving-cert podName:7d5670ad-73cc-493e-a54f-e684a6f00f06 nodeName:}" failed. No retries permitted until 2026-01-23 10:53:56.91024555 +0000 UTC m=+146.447498377 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/7d5670ad-73cc-493e-a54f-e684a6f00f06-package-server-manager-serving-cert") pod "package-server-manager-789f6589d5-jp7cg" (UID: "7d5670ad-73cc-493e-a54f-e684a6f00f06") : failed to sync secret cache: timed out waiting for the condition Jan 23 10:53:56 crc kubenswrapper[4957]: I0123 10:53:56.418318 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Jan 23 10:53:56 crc kubenswrapper[4957]: I0123 10:53:56.437466 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Jan 23 10:53:56 crc kubenswrapper[4957]: I0123 10:53:56.457997 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Jan 23 10:53:56 crc kubenswrapper[4957]: I0123 10:53:56.478323 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Jan 23 10:53:56 crc kubenswrapper[4957]: I0123 10:53:56.497931 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Jan 23 10:53:56 crc kubenswrapper[4957]: I0123 10:53:56.518240 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Jan 23 10:53:56 crc kubenswrapper[4957]: I0123 10:53:56.538680 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Jan 23 10:53:56 crc kubenswrapper[4957]: I0123 10:53:56.559220 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Jan 23 10:53:56 crc kubenswrapper[4957]: I0123 10:53:56.578626 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Jan 23 10:53:56 crc kubenswrapper[4957]: I0123 10:53:56.598892 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Jan 23 10:53:56 crc kubenswrapper[4957]: I0123 10:53:56.618860 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 23 10:53:56 crc kubenswrapper[4957]: I0123 10:53:56.638671 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 23 10:53:56 crc kubenswrapper[4957]: I0123 10:53:56.658383 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Jan 23 10:53:56 crc kubenswrapper[4957]: I0123 10:53:56.679088 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Jan 23 10:53:56 crc kubenswrapper[4957]: I0123 10:53:56.699395 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Jan 23 10:53:56 crc kubenswrapper[4957]: I0123 10:53:56.720138 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Jan 23 10:53:56 crc kubenswrapper[4957]: I0123 
10:53:56.739420 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Jan 23 10:53:56 crc kubenswrapper[4957]: I0123 10:53:56.789441 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkr47\" (UniqueName: \"kubernetes.io/projected/92ace232-9107-480d-a4cd-27d7bd114efd-kube-api-access-lkr47\") pod \"downloads-7954f5f757-d94z6\" (UID: \"92ace232-9107-480d-a4cd-27d7bd114efd\") " pod="openshift-console/downloads-7954f5f757-d94z6" Jan 23 10:53:56 crc kubenswrapper[4957]: I0123 10:53:56.798654 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Jan 23 10:53:56 crc kubenswrapper[4957]: I0123 10:53:56.805749 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zksk2\" (UniqueName: \"kubernetes.io/projected/b721a986-d921-41f6-ba96-47647e168858-kube-api-access-zksk2\") pod \"console-f9d7485db-frdsx\" (UID: \"b721a986-d921-41f6-ba96-47647e168858\") " pod="openshift-console/console-f9d7485db-frdsx" Jan 23 10:53:56 crc kubenswrapper[4957]: I0123 10:53:56.832395 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 23 10:53:56 crc kubenswrapper[4957]: E0123 10:53:56.832643 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-23 10:55:58.832605088 +0000 UTC m=+268.369857815 (durationBeforeRetry 2m2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 23 10:53:56 crc kubenswrapper[4957]: I0123 10:53:56.832897 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 23 10:53:56 crc kubenswrapper[4957]: I0123 10:53:56.833053 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 23 10:53:56 crc kubenswrapper[4957]: I0123 10:53:56.833363 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 23 10:53:56 crc kubenswrapper[4957]: I0123 10:53:56.833461 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 23 10:53:56 crc kubenswrapper[4957]: I0123 10:53:56.834094 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 23 10:53:56 crc kubenswrapper[4957]: I0123 10:53:56.835125 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmqwf\" (UniqueName: \"kubernetes.io/projected/4238c53f-acdc-409f-8d1d-e4608fe5c239-kube-api-access-xmqwf\") pod \"machine-api-operator-5694c8668f-lrnc4\" (UID: \"4238c53f-acdc-409f-8d1d-e4608fe5c239\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lrnc4" Jan 23 10:53:56 crc kubenswrapper[4957]: I0123 10:53:56.836825 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 23 10:53:56 crc kubenswrapper[4957]: I0123 10:53:56.837934 4957 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 23 10:53:56 crc kubenswrapper[4957]: I0123 10:53:56.839949 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 23 10:53:56 crc kubenswrapper[4957]: I0123 10:53:56.854660 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hz5f\" (UniqueName: \"kubernetes.io/projected/6bb12583-2035-41bb-8847-fd46198f7ede-kube-api-access-2hz5f\") pod \"apiserver-76f77b778f-5t9v6\" (UID: \"6bb12583-2035-41bb-8847-fd46198f7ede\") " pod="openshift-apiserver/apiserver-76f77b778f-5t9v6" Jan 23 10:53:56 crc kubenswrapper[4957]: I0123 10:53:56.875200 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rgpg\" (UniqueName: \"kubernetes.io/projected/a4d8bbc7-e221-4057-ae8c-c63d33b2e4f5-kube-api-access-8rgpg\") pod \"openshift-config-operator-7777fb866f-5vdlt\" (UID: \"a4d8bbc7-e221-4057-ae8c-c63d33b2e4f5\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-5vdlt" Jan 23 10:53:56 crc kubenswrapper[4957]: I0123 10:53:56.880406 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-d94z6" Jan 23 10:53:56 crc kubenswrapper[4957]: I0123 10:53:56.894247 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpvql\" (UniqueName: \"kubernetes.io/projected/f6bd9bee-ee36-4280-948c-f39f88acaa70-kube-api-access-fpvql\") pod \"route-controller-manager-6576b87f9c-z7qr7\" (UID: \"f6bd9bee-ee36-4280-948c-f39f88acaa70\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-z7qr7" Jan 23 10:53:56 crc kubenswrapper[4957]: I0123 10:53:56.897319 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-5vdlt" Jan 23 10:53:56 crc kubenswrapper[4957]: I0123 10:53:56.908299 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-lrnc4" Jan 23 10:53:56 crc kubenswrapper[4957]: I0123 10:53:56.918981 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 23 10:53:56 crc kubenswrapper[4957]: I0123 10:53:56.919249 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bn2td\" (UniqueName: \"kubernetes.io/projected/f918b8f5-e2ac-439f-b9b6-8a2e85dd4ba3-kube-api-access-bn2td\") pod \"apiserver-7bbb656c7d-fh2f4\" (UID: \"f918b8f5-e2ac-439f-b9b6-8a2e85dd4ba3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fh2f4" Jan 23 10:53:56 crc kubenswrapper[4957]: I0123 10:53:56.934025 4957 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 23 10:53:56 crc kubenswrapper[4957]: I0123 10:53:56.934447 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6cd38561-c9ad-4248-adfe-b62f67cf4221-serving-cert\") pod \"service-ca-operator-777779d784-wt5d2\" (UID: \"6cd38561-c9ad-4248-adfe-b62f67cf4221\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wt5d2" Jan 23 10:53:56 crc kubenswrapper[4957]: I0123 10:53:56.934496 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/410302c2-c1e9-4b31-8edb-a62078470e7f-webhook-cert\") pod \"packageserver-d55dfcdfc-z6lpq\" (UID: \"410302c2-c1e9-4b31-8edb-a62078470e7f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z6lpq" Jan 23 10:53:56 crc kubenswrapper[4957]: I0123 10:53:56.934551 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/aad2f022-0b03-41aa-a16d-000844e34eed-srv-cert\") pod \"catalog-operator-68c6474976-kvtws\" (UID: \"aad2f022-0b03-41aa-a16d-000844e34eed\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kvtws" Jan 23 10:53:56 crc kubenswrapper[4957]: I0123 10:53:56.934581 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/410302c2-c1e9-4b31-8edb-a62078470e7f-apiservice-cert\") pod \"packageserver-d55dfcdfc-z6lpq\" (UID: \"410302c2-c1e9-4b31-8edb-a62078470e7f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z6lpq" Jan 23 10:53:56 crc kubenswrapper[4957]: I0123 10:53:56.934599 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/7d5670ad-73cc-493e-a54f-e684a6f00f06-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-jp7cg\" (UID: \"7d5670ad-73cc-493e-a54f-e684a6f00f06\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jp7cg" Jan 23 10:53:56 crc kubenswrapper[4957]: I0123 10:53:56.934628 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6cd38561-c9ad-4248-adfe-b62f67cf4221-config\") pod \"service-ca-operator-777779d784-wt5d2\" (UID: \"6cd38561-c9ad-4248-adfe-b62f67cf4221\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wt5d2" Jan 23 10:53:56 crc kubenswrapper[4957]: I0123 10:53:56.934666 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/aad2f022-0b03-41aa-a16d-000844e34eed-profile-collector-cert\") pod \"catalog-operator-68c6474976-kvtws\" (UID: \"aad2f022-0b03-41aa-a16d-000844e34eed\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kvtws" Jan 23 10:53:56 crc kubenswrapper[4957]: I0123 10:53:56.936830 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6cd38561-c9ad-4248-adfe-b62f67cf4221-config\") pod \"service-ca-operator-777779d784-wt5d2\" (UID: \"6cd38561-c9ad-4248-adfe-b62f67cf4221\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wt5d2" Jan 23 10:53:56 crc kubenswrapper[4957]: I0123 10:53:56.939070 4957 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6cd38561-c9ad-4248-adfe-b62f67cf4221-serving-cert\") pod \"service-ca-operator-777779d784-wt5d2\" (UID: \"6cd38561-c9ad-4248-adfe-b62f67cf4221\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wt5d2" Jan 23 10:53:56 crc kubenswrapper[4957]: I0123 10:53:56.940675 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/aad2f022-0b03-41aa-a16d-000844e34eed-profile-collector-cert\") pod \"catalog-operator-68c6474976-kvtws\" (UID: \"aad2f022-0b03-41aa-a16d-000844e34eed\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kvtws" Jan 23 10:53:56 crc kubenswrapper[4957]: I0123 10:53:56.942040 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/7d5670ad-73cc-493e-a54f-e684a6f00f06-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-jp7cg\" (UID: \"7d5670ad-73cc-493e-a54f-e684a6f00f06\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jp7cg" Jan 23 10:53:56 crc kubenswrapper[4957]: I0123 10:53:56.942402 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/aad2f022-0b03-41aa-a16d-000844e34eed-srv-cert\") pod \"catalog-operator-68c6474976-kvtws\" (UID: \"aad2f022-0b03-41aa-a16d-000844e34eed\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kvtws" Jan 23 10:53:56 crc kubenswrapper[4957]: I0123 10:53:56.952899 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lk58c\" (UniqueName: \"kubernetes.io/projected/ea6adf64-ce45-4965-8c43-0f0215c18109-kube-api-access-lk58c\") pod \"controller-manager-879f6c89f-cqcmp\" (UID: \"ea6adf64-ce45-4965-8c43-0f0215c18109\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cqcmp" Jan 23 10:53:56 crc kubenswrapper[4957]: I0123 10:53:56.959363 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Jan 23 10:53:56 crc kubenswrapper[4957]: I0123 10:53:56.961732 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-z7qr7" Jan 23 10:53:56 crc kubenswrapper[4957]: I0123 10:53:56.971818 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-df8md\" (UniqueName: \"kubernetes.io/projected/d10230a4-235f-4c7d-9057-cc613fab04fc-kube-api-access-df8md\") pod \"openshift-apiserver-operator-796bbdcf4f-lt6lk\" (UID: \"d10230a4-235f-4c7d-9057-cc613fab04fc\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lt6lk" Jan 23 10:53:56 crc kubenswrapper[4957]: I0123 10:53:56.972345 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/410302c2-c1e9-4b31-8edb-a62078470e7f-webhook-cert\") pod \"packageserver-d55dfcdfc-z6lpq\" (UID: \"410302c2-c1e9-4b31-8edb-a62078470e7f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z6lpq" Jan 23 10:53:56 crc kubenswrapper[4957]: I0123 10:53:56.973039 4957 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lt6lk" Jan 23 10:53:56 crc kubenswrapper[4957]: I0123 10:53:56.973740 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/410302c2-c1e9-4b31-8edb-a62078470e7f-apiservice-cert\") pod \"packageserver-d55dfcdfc-z6lpq\" (UID: \"410302c2-c1e9-4b31-8edb-a62078470e7f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z6lpq" Jan 23 10:53:56 crc kubenswrapper[4957]: I0123 10:53:56.979432 4957 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Jan 23 10:53:56 crc kubenswrapper[4957]: I0123 10:53:56.980988 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fh2f4" Jan 23 10:53:56 crc kubenswrapper[4957]: I0123 10:53:56.986957 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-frdsx" Jan 23 10:53:56 crc kubenswrapper[4957]: I0123 10:53:56.992404 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.001189 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.020720 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.062923 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.110904 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.112078 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.121507 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.131568 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-cqcmp" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.144768 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.159420 4957 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-5t9v6" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.163201 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.179808 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.197765 4957 request.go:700] Waited for 1.899270179s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-dns/secrets?fieldSelector=metadata.name%3Ddns-dockercfg-jwfmh&limit=500&resourceVersion=0 Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.202395 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.218821 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.240125 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.276258 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k28ws\" (UniqueName: \"kubernetes.io/projected/7183c961-6fa6-49c8-909a-7defa10a655a-kube-api-access-k28ws\") pod \"cluster-samples-operator-665b6dd947-7t6gz\" (UID: \"7183c961-6fa6-49c8-909a-7defa10a655a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7t6gz" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.291113 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5b657ce2-3516-4ac1-9bdb-a6fc97c19b31-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-fkz4f\" (UID: \"5b657ce2-3516-4ac1-9bdb-a6fc97c19b31\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fkz4f" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.317301 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5fb7\" (UniqueName: \"kubernetes.io/projected/fe95d360-2a00-47e5-b577-817575b85417-kube-api-access-t5fb7\") pod \"cluster-image-registry-operator-dc59b4c8b-h2fvd\" (UID: \"fe95d360-2a00-47e5-b577-817575b85417\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h2fvd" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.334424 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhct5\" (UniqueName: \"kubernetes.io/projected/c210711d-5840-4b58-948c-e9d19f041ee2-kube-api-access-vhct5\") pod \"openshift-controller-manager-operator-756b6f6bc6-7cppv\" (UID: \"c210711d-5840-4b58-948c-e9d19f041ee2\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7cppv" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.338500 4957 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7cppv" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.350974 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sv67w\" (UniqueName: \"kubernetes.io/projected/f0f6f781-8789-47c0-badf-5f4a9dc36621-kube-api-access-sv67w\") pod \"authentication-operator-69f744f599-d6nt4\" (UID: \"f0f6f781-8789-47c0-badf-5f4a9dc36621\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-d6nt4" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.368146 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-d6nt4" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.371586 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7t6gz" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.373217 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6b2q\" (UniqueName: \"kubernetes.io/projected/da48191c-513e-4069-b4b7-8f6de3363326-kube-api-access-m6b2q\") pod \"etcd-operator-b45778765-2z4v2\" (UID: \"da48191c-513e-4069-b4b7-8f6de3363326\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2z4v2" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.378033 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-2z4v2" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.391816 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-klbjf\" (UniqueName: \"kubernetes.io/projected/dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d-kube-api-access-klbjf\") pod \"oauth-openshift-558db77b4-wtmvm\" (UID: \"dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d\") " pod="openshift-authentication/oauth-openshift-558db77b4-wtmvm" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.399640 4957 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fkz4f" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.412712 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g647f\" (UniqueName: \"kubernetes.io/projected/214c7e14-5663-41c3-8a75-4573dff48b63-kube-api-access-g647f\") pod \"machine-approver-56656f9798-jh66z\" (UID: \"214c7e14-5663-41c3-8a75-4573dff48b63\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-jh66z" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.434392 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6e16c2a0-8698-450c-b5d9-78ab6c26a2b3-bound-sa-token\") pod \"ingress-operator-5b745b69d9-wdgd4\" (UID: \"6e16c2a0-8698-450c-b5d9-78ab6c26a2b3\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-wdgd4" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.459218 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzvwp\" (UniqueName: \"kubernetes.io/projected/9af57495-ed2b-4a03-8206-b26948dfa61a-kube-api-access-bzvwp\") pod \"console-operator-58897d9998-bcjxk\" (UID: \"9af57495-ed2b-4a03-8206-b26948dfa61a\") " pod="openshift-console-operator/console-operator-58897d9998-bcjxk" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.475869 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kxxj\" (UniqueName: \"kubernetes.io/projected/808262b6-32ca-42f1-938e-f009dac6b1db-kube-api-access-9kxxj\") pod \"dns-operator-744455d44c-rfzwr\" (UID: \"808262b6-32ca-42f1-938e-f009dac6b1db\") " pod="openshift-dns-operator/dns-operator-744455d44c-rfzwr" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.495061 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fe95d360-2a00-47e5-b577-817575b85417-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-h2fvd\" (UID: \"fe95d360-2a00-47e5-b577-817575b85417\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h2fvd" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.501534 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-lrnc4"] Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.514346 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-z7qr7"] Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.516502 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-5vdlt"] Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.521072 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5bmn\" (UniqueName: \"kubernetes.io/projected/6e16c2a0-8698-450c-b5d9-78ab6c26a2b3-kube-api-access-t5bmn\") pod \"ingress-operator-5b745b69d9-wdgd4\" (UID: \"6e16c2a0-8698-450c-b5d9-78ab6c26a2b3\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-wdgd4" Jan 23 10:53:57 crc kubenswrapper[4957]: W0123 10:53:57.526502 4957 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-db7f9dd2b9f802d433f0053bb71af4bbd6d8623f4f67fc6a374663861563ccaf WatchSource:0}: 
Error finding container db7f9dd2b9f802d433f0053bb71af4bbd6d8623f4f67fc6a374663861563ccaf: Status 404 returned error can't find the container with id db7f9dd2b9f802d433f0053bb71af4bbd6d8623f4f67fc6a374663861563ccaf Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.528126 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-d94z6"] Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.537576 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-cqcmp"] Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.546347 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65vxk\" (UniqueName: \"kubernetes.io/projected/beac68ed-ac4c-424e-aaff-79a4f7f246d0-kube-api-access-65vxk\") pod \"router-default-5444994796-6brn9\" (UID: \"beac68ed-ac4c-424e-aaff-79a4f7f246d0\") " pod="openshift-ingress/router-default-5444994796-6brn9" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.555523 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96sbr\" (UniqueName: \"kubernetes.io/projected/410302c2-c1e9-4b31-8edb-a62078470e7f-kube-api-access-96sbr\") pod \"packageserver-d55dfcdfc-z6lpq\" (UID: \"410302c2-c1e9-4b31-8edb-a62078470e7f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z6lpq" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.572061 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwsx2\" (UniqueName: \"kubernetes.io/projected/d140b4dc-6d8e-4940-9a60-aa98665ac1b2-kube-api-access-lwsx2\") pod \"csi-hostpathplugin-z8lhm\" (UID: \"d140b4dc-6d8e-4940-9a60-aa98665ac1b2\") " pod="hostpath-provisioner/csi-hostpathplugin-z8lhm" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.582345 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7cppv"] Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.596596 4957 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-wtmvm" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.606934 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jn8vq\" (UniqueName: \"kubernetes.io/projected/12c22d2e-17e6-4c9a-83f0-24225d15d476-kube-api-access-jn8vq\") pod \"machine-config-operator-74547568cd-tnjn2\" (UID: \"12c22d2e-17e6-4c9a-83f0-24225d15d476\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tnjn2" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.625890 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-frdsx"] Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.626892 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lt6lk"] Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.627730 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zj4sm\" (UniqueName: \"kubernetes.io/projected/9f5fe703-fd89-4007-ae49-96ae0202d69c-kube-api-access-zj4sm\") pod \"kube-storage-version-migrator-operator-b67b599dd-grskq\" (UID: \"9f5fe703-fd89-4007-ae49-96ae0202d69c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-grskq" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.631716 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z6lpq" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.638061 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-d6nt4"] Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.641877 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bh5c\" (UniqueName: \"kubernetes.io/projected/5cdf7d3f-3edb-4741-a232-1f4d969417ce-kube-api-access-7bh5c\") pod \"service-ca-9c57cc56f-8p9zs\" (UID: \"5cdf7d3f-3edb-4741-a232-1f4d969417ce\") " pod="openshift-service-ca/service-ca-9c57cc56f-8p9zs" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.647313 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-jh66z" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.648316 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-5t9v6"] Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.649733 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-fh2f4"] Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.657825 4957 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-z8lhm" Jan 23 10:53:57 crc kubenswrapper[4957]: W0123 10:53:57.666747 4957 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf918b8f5_e2ac_439f_b9b6_8a2e85dd4ba3.slice/crio-6a688a6934f0bb185dbfc3cb264e59889daba0c9aca129a2adde95ecdca6dd35 WatchSource:0}: Error finding container 6a688a6934f0bb185dbfc3cb264e59889daba0c9aca129a2adde95ecdca6dd35: Status 404 returned error can't find the container with id 6a688a6934f0bb185dbfc3cb264e59889daba0c9aca129a2adde95ecdca6dd35 Jan 23 10:53:57 crc kubenswrapper[4957]: W0123 10:53:57.668729 4957 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd10230a4_235f_4c7d_9057_cc613fab04fc.slice/crio-56f91b9dc45aa9c107857a000f5f67fef0a4cca711223caadb77ace4b2dbde68 WatchSource:0}: Error finding container 56f91b9dc45aa9c107857a000f5f67fef0a4cca711223caadb77ace4b2dbde68: Status 404 returned error can't find the container with id 56f91b9dc45aa9c107857a000f5f67fef0a4cca711223caadb77ace4b2dbde68 Jan 23 10:53:57 crc kubenswrapper[4957]: W0123 10:53:57.669484 4957 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf0f6f781_8789_47c0_badf_5f4a9dc36621.slice/crio-0ae372a76d6c9aae448685c04ea8f270615e07107b6a8f55b2b4e3e9d05cf84b WatchSource:0}: Error finding container 0ae372a76d6c9aae448685c04ea8f270615e07107b6a8f55b2b4e3e9d05cf84b: Status 404 returned error can't find the container with id 0ae372a76d6c9aae448685c04ea8f270615e07107b6a8f55b2b4e3e9d05cf84b Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.675710 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czhvg\" (UniqueName: \"kubernetes.io/projected/5b14648d-48f6-42d6-8bd3-91ddcc54bfbc-kube-api-access-czhvg\") pod \"multus-admission-controller-857f4d67dd-2zqvt\" (UID: \"5b14648d-48f6-42d6-8bd3-91ddcc54bfbc\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-2zqvt" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.676914 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c810606c-cfa3-4391-bbc2-7e6ff647393c-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-x94pd\" (UID: \"c810606c-cfa3-4391-bbc2-7e6ff647393c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-x94pd" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.689856 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-rfzwr" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.694928 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6tvk\" (UniqueName: \"kubernetes.io/projected/6cd38561-c9ad-4248-adfe-b62f67cf4221-kube-api-access-d6tvk\") pod \"service-ca-operator-777779d784-wt5d2\" (UID: \"6cd38561-c9ad-4248-adfe-b62f67cf4221\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wt5d2" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.705725 4957 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-wdgd4" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.717138 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvkn8\" (UniqueName: \"kubernetes.io/projected/aad2f022-0b03-41aa-a16d-000844e34eed-kube-api-access-tvkn8\") pod \"catalog-operator-68c6474976-kvtws\" (UID: \"aad2f022-0b03-41aa-a16d-000844e34eed\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kvtws" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.731231 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7t6gz"] Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.738406 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rnqq\" (UniqueName: \"kubernetes.io/projected/7d5670ad-73cc-493e-a54f-e684a6f00f06-kube-api-access-9rnqq\") pod \"package-server-manager-789f6589d5-jp7cg\" (UID: \"7d5670ad-73cc-493e-a54f-e684a6f00f06\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jp7cg" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.741993 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h2fvd" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.754224 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-bcjxk" Jan 23 10:53:57 crc kubenswrapper[4957]: W0123 10:53:57.759737 4957 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod214c7e14_5663_41c3_8a75_4573dff48b63.slice/crio-7bd6bf516ea7321e3da0082a7467a4584bd36864bef51dc3551f360c88546912 WatchSource:0}: Error finding container 7bd6bf516ea7321e3da0082a7467a4584bd36864bef51dc3551f360c88546912: Status 404 returned error can't find the container with id 7bd6bf516ea7321e3da0082a7467a4584bd36864bef51dc3551f360c88546912 Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.767360 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-6brn9" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.781559 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-grskq" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.790839 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tnjn2" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.805254 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-x94pd" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.830044 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-8p9zs" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.840625 4957 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-2zqvt" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.848162 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmqcg\" (UniqueName: \"kubernetes.io/projected/f597fef0-1fc5-4826-a130-dd45792b308d-kube-api-access-bmqcg\") pod \"machine-config-controller-84d6567774-92gbq\" (UID: \"f597fef0-1fc5-4826-a130-dd45792b308d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-92gbq" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.848670 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lncns\" (UniqueName: \"kubernetes.io/projected/757f7f5f-b2fc-4fa0-b59b-63ad3548a307-kube-api-access-lncns\") pod \"collect-profiles-29486085-pm58b\" (UID: \"757f7f5f-b2fc-4fa0-b59b-63ad3548a307\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29486085-pm58b" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.848704 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e8dec53f-51c9-4e3d-b111-45152d5b0c71-registry-certificates\") pod \"image-registry-697d97f7c8-hgc64\" (UID: \"e8dec53f-51c9-4e3d-b111-45152d5b0c71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hgc64" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.848734 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e8dec53f-51c9-4e3d-b111-45152d5b0c71-trusted-ca\") pod \"image-registry-697d97f7c8-hgc64\" (UID: \"e8dec53f-51c9-4e3d-b111-45152d5b0c71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hgc64" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.848757 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e8dec53f-51c9-4e3d-b111-45152d5b0c71-bound-sa-token\") pod \"image-registry-697d97f7c8-hgc64\" (UID: \"e8dec53f-51c9-4e3d-b111-45152d5b0c71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hgc64" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.848789 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/933ef718-6583-4b78-b8ca-79cfab4926ee-config\") pod \"kube-apiserver-operator-766d6c64bb-g5hzp\" (UID: \"933ef718-6583-4b78-b8ca-79cfab4926ee\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-g5hzp" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.848845 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e8dec53f-51c9-4e3d-b111-45152d5b0c71-installation-pull-secrets\") pod \"image-registry-697d97f7c8-hgc64\" (UID: \"e8dec53f-51c9-4e3d-b111-45152d5b0c71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hgc64" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.848904 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lbr8\" (UniqueName: \"kubernetes.io/projected/04b86548-903a-4a0b-bb5f-c9cd297a9047-kube-api-access-9lbr8\") pod \"control-plane-machine-set-operator-78cbb6b69f-xs8l7\" 
(UID: \"04b86548-903a-4a0b-bb5f-c9cd297a9047\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xs8l7" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.848982 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f597fef0-1fc5-4826-a130-dd45792b308d-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-92gbq\" (UID: \"f597fef0-1fc5-4826-a130-dd45792b308d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-92gbq" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.849019 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgcwn\" (UniqueName: \"kubernetes.io/projected/c951bf2d-f3fe-4a75-8e95-040c46cb1f01-kube-api-access-qgcwn\") pod \"marketplace-operator-79b997595-7cfk5\" (UID: \"c951bf2d-f3fe-4a75-8e95-040c46cb1f01\") " pod="openshift-marketplace/marketplace-operator-79b997595-7cfk5" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.849044 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/04b86548-903a-4a0b-bb5f-c9cd297a9047-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-xs8l7\" (UID: \"04b86548-903a-4a0b-bb5f-c9cd297a9047\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xs8l7" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.849081 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/933ef718-6583-4b78-b8ca-79cfab4926ee-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-g5hzp\" (UID: \"933ef718-6583-4b78-b8ca-79cfab4926ee\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-g5hzp" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.849114 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/933ef718-6583-4b78-b8ca-79cfab4926ee-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-g5hzp\" (UID: \"933ef718-6583-4b78-b8ca-79cfab4926ee\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-g5hzp" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.849138 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/757f7f5f-b2fc-4fa0-b59b-63ad3548a307-secret-volume\") pod \"collect-profiles-29486085-pm58b\" (UID: \"757f7f5f-b2fc-4fa0-b59b-63ad3548a307\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29486085-pm58b" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.849164 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nc9g4\" (UniqueName: \"kubernetes.io/projected/e8dec53f-51c9-4e3d-b111-45152d5b0c71-kube-api-access-nc9g4\") pod \"image-registry-697d97f7c8-hgc64\" (UID: \"e8dec53f-51c9-4e3d-b111-45152d5b0c71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hgc64" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.849187 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/96879d14-19e3-4044-a2ea-163a66fe801a-profile-collector-cert\") pod \"olm-operator-6b444d44fb-7dpwm\" (UID: \"96879d14-19e3-4044-a2ea-163a66fe801a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7dpwm" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.849348 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e8dec53f-51c9-4e3d-b111-45152d5b0c71-ca-trust-extracted\") pod \"image-registry-697d97f7c8-hgc64\" (UID: \"e8dec53f-51c9-4e3d-b111-45152d5b0c71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hgc64" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.849536 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnnwv\" (UniqueName: \"kubernetes.io/projected/96879d14-19e3-4044-a2ea-163a66fe801a-kube-api-access-nnnwv\") pod \"olm-operator-6b444d44fb-7dpwm\" (UID: \"96879d14-19e3-4044-a2ea-163a66fe801a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7dpwm" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.849689 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c951bf2d-f3fe-4a75-8e95-040c46cb1f01-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-7cfk5\" (UID: \"c951bf2d-f3fe-4a75-8e95-040c46cb1f01\") " pod="openshift-marketplace/marketplace-operator-79b997595-7cfk5" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.849786 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/757f7f5f-b2fc-4fa0-b59b-63ad3548a307-config-volume\") pod \"collect-profiles-29486085-pm58b\" (UID: \"757f7f5f-b2fc-4fa0-b59b-63ad3548a307\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29486085-pm58b" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.849850 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hgc64\" (UID: \"e8dec53f-51c9-4e3d-b111-45152d5b0c71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hgc64" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.849882 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/96879d14-19e3-4044-a2ea-163a66fe801a-srv-cert\") pod \"olm-operator-6b444d44fb-7dpwm\" (UID: \"96879d14-19e3-4044-a2ea-163a66fe801a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7dpwm" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.849911 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6r2nq\" (UniqueName: \"kubernetes.io/projected/6e07d3a9-d361-4fb4-a795-0849abadd0f0-kube-api-access-6r2nq\") pod \"migrator-59844c95c7-nnxst\" (UID: \"6e07d3a9-d361-4fb4-a795-0849abadd0f0\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nnxst" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.849950 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/f597fef0-1fc5-4826-a130-dd45792b308d-proxy-tls\") pod \"machine-config-controller-84d6567774-92gbq\" (UID: \"f597fef0-1fc5-4826-a130-dd45792b308d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-92gbq" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.849995 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e8dec53f-51c9-4e3d-b111-45152d5b0c71-registry-tls\") pod \"image-registry-697d97f7c8-hgc64\" (UID: \"e8dec53f-51c9-4e3d-b111-45152d5b0c71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hgc64" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.850028 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c951bf2d-f3fe-4a75-8e95-040c46cb1f01-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-7cfk5\" (UID: \"c951bf2d-f3fe-4a75-8e95-040c46cb1f01\") " pod="openshift-marketplace/marketplace-operator-79b997595-7cfk5" Jan 23 10:53:57 crc kubenswrapper[4957]: E0123 10:53:57.850309 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-23 10:53:58.350274912 +0000 UTC m=+147.887527599 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hgc64" (UID: "e8dec53f-51c9-4e3d-b111-45152d5b0c71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.879170 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-2z4v2"] Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.881262 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jp7cg" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.882189 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fkz4f"] Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.912536 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-wt5d2" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.921866 4957 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kvtws" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.946846 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-wtmvm"] Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.951176 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.951447 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/04b22862-334a-4deb-b327-10e17c568e99-node-bootstrap-token\") pod \"machine-config-server-n7rd5\" (UID: \"04b22862-334a-4deb-b327-10e17c568e99\") " pod="openshift-machine-config-operator/machine-config-server-n7rd5" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.951496 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/933ef718-6583-4b78-b8ca-79cfab4926ee-config\") pod \"kube-apiserver-operator-766d6c64bb-g5hzp\" (UID: \"933ef718-6583-4b78-b8ca-79cfab4926ee\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-g5hzp" Jan 23 10:53:57 crc kubenswrapper[4957]: E0123 10:53:57.951546 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-23 10:53:58.451505355 +0000 UTC m=+147.988758052 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.951599 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e8dec53f-51c9-4e3d-b111-45152d5b0c71-installation-pull-secrets\") pod \"image-registry-697d97f7c8-hgc64\" (UID: \"e8dec53f-51c9-4e3d-b111-45152d5b0c71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hgc64" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.951686 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9lbr8\" (UniqueName: \"kubernetes.io/projected/04b86548-903a-4a0b-bb5f-c9cd297a9047-kube-api-access-9lbr8\") pod \"control-plane-machine-set-operator-78cbb6b69f-xs8l7\" (UID: \"04b86548-903a-4a0b-bb5f-c9cd297a9047\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xs8l7" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.951781 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8dsn\" (UniqueName: \"kubernetes.io/projected/a03b75a6-72e3-45a2-a6aa-95f65af54796-kube-api-access-x8dsn\") pod \"dns-default-rxncn\" (UID: \"a03b75a6-72e3-45a2-a6aa-95f65af54796\") " pod="openshift-dns/dns-default-rxncn" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.951807 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f597fef0-1fc5-4826-a130-dd45792b308d-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-92gbq\" (UID: \"f597fef0-1fc5-4826-a130-dd45792b308d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-92gbq" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.951842 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgcwn\" (UniqueName: \"kubernetes.io/projected/c951bf2d-f3fe-4a75-8e95-040c46cb1f01-kube-api-access-qgcwn\") pod \"marketplace-operator-79b997595-7cfk5\" (UID: \"c951bf2d-f3fe-4a75-8e95-040c46cb1f01\") " pod="openshift-marketplace/marketplace-operator-79b997595-7cfk5" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.951866 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/04b86548-903a-4a0b-bb5f-c9cd297a9047-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-xs8l7\" (UID: \"04b86548-903a-4a0b-bb5f-c9cd297a9047\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xs8l7" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.952266 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/933ef718-6583-4b78-b8ca-79cfab4926ee-config\") pod \"kube-apiserver-operator-766d6c64bb-g5hzp\" (UID: \"933ef718-6583-4b78-b8ca-79cfab4926ee\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-g5hzp" Jan 23 10:53:57 crc 
kubenswrapper[4957]: I0123 10:53:57.953010 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/933ef718-6583-4b78-b8ca-79cfab4926ee-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-g5hzp\" (UID: \"933ef718-6583-4b78-b8ca-79cfab4926ee\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-g5hzp" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.953066 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nf54g\" (UniqueName: \"kubernetes.io/projected/04b22862-334a-4deb-b327-10e17c568e99-kube-api-access-nf54g\") pod \"machine-config-server-n7rd5\" (UID: \"04b22862-334a-4deb-b327-10e17c568e99\") " pod="openshift-machine-config-operator/machine-config-server-n7rd5" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.953088 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a03b75a6-72e3-45a2-a6aa-95f65af54796-config-volume\") pod \"dns-default-rxncn\" (UID: \"a03b75a6-72e3-45a2-a6aa-95f65af54796\") " pod="openshift-dns/dns-default-rxncn" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.953124 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/933ef718-6583-4b78-b8ca-79cfab4926ee-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-g5hzp\" (UID: \"933ef718-6583-4b78-b8ca-79cfab4926ee\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-g5hzp" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.953152 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/757f7f5f-b2fc-4fa0-b59b-63ad3548a307-secret-volume\") pod \"collect-profiles-29486085-pm58b\" (UID: \"757f7f5f-b2fc-4fa0-b59b-63ad3548a307\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29486085-pm58b" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.953214 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nc9g4\" (UniqueName: \"kubernetes.io/projected/e8dec53f-51c9-4e3d-b111-45152d5b0c71-kube-api-access-nc9g4\") pod \"image-registry-697d97f7c8-hgc64\" (UID: \"e8dec53f-51c9-4e3d-b111-45152d5b0c71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hgc64" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.953256 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/96879d14-19e3-4044-a2ea-163a66fe801a-profile-collector-cert\") pod \"olm-operator-6b444d44fb-7dpwm\" (UID: \"96879d14-19e3-4044-a2ea-163a66fe801a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7dpwm" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.953315 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e8dec53f-51c9-4e3d-b111-45152d5b0c71-ca-trust-extracted\") pod \"image-registry-697d97f7c8-hgc64\" (UID: \"e8dec53f-51c9-4e3d-b111-45152d5b0c71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hgc64" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.953363 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" 
(UniqueName: \"kubernetes.io/secret/b9897961-d77f-441d-9ed1-94a1ea26860e-cert\") pod \"ingress-canary-scpzq\" (UID: \"b9897961-d77f-441d-9ed1-94a1ea26860e\") " pod="openshift-ingress-canary/ingress-canary-scpzq" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.953436 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nnnwv\" (UniqueName: \"kubernetes.io/projected/96879d14-19e3-4044-a2ea-163a66fe801a-kube-api-access-nnnwv\") pod \"olm-operator-6b444d44fb-7dpwm\" (UID: \"96879d14-19e3-4044-a2ea-163a66fe801a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7dpwm" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.953480 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c951bf2d-f3fe-4a75-8e95-040c46cb1f01-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-7cfk5\" (UID: \"c951bf2d-f3fe-4a75-8e95-040c46cb1f01\") " pod="openshift-marketplace/marketplace-operator-79b997595-7cfk5" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.953586 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/757f7f5f-b2fc-4fa0-b59b-63ad3548a307-config-volume\") pod \"collect-profiles-29486085-pm58b\" (UID: \"757f7f5f-b2fc-4fa0-b59b-63ad3548a307\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29486085-pm58b" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.953628 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hgc64\" (UID: \"e8dec53f-51c9-4e3d-b111-45152d5b0c71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hgc64" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.953655 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/96879d14-19e3-4044-a2ea-163a66fe801a-srv-cert\") pod \"olm-operator-6b444d44fb-7dpwm\" (UID: \"96879d14-19e3-4044-a2ea-163a66fe801a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7dpwm" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.953690 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6r2nq\" (UniqueName: \"kubernetes.io/projected/6e07d3a9-d361-4fb4-a795-0849abadd0f0-kube-api-access-6r2nq\") pod \"migrator-59844c95c7-nnxst\" (UID: \"6e07d3a9-d361-4fb4-a795-0849abadd0f0\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nnxst" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.953790 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f597fef0-1fc5-4826-a130-dd45792b308d-proxy-tls\") pod \"machine-config-controller-84d6567774-92gbq\" (UID: \"f597fef0-1fc5-4826-a130-dd45792b308d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-92gbq" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.953794 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f597fef0-1fc5-4826-a130-dd45792b308d-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-92gbq\" (UID: 
\"f597fef0-1fc5-4826-a130-dd45792b308d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-92gbq" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.953907 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e8dec53f-51c9-4e3d-b111-45152d5b0c71-registry-tls\") pod \"image-registry-697d97f7c8-hgc64\" (UID: \"e8dec53f-51c9-4e3d-b111-45152d5b0c71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hgc64" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.953936 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a03b75a6-72e3-45a2-a6aa-95f65af54796-metrics-tls\") pod \"dns-default-rxncn\" (UID: \"a03b75a6-72e3-45a2-a6aa-95f65af54796\") " pod="openshift-dns/dns-default-rxncn" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.953973 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c951bf2d-f3fe-4a75-8e95-040c46cb1f01-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-7cfk5\" (UID: \"c951bf2d-f3fe-4a75-8e95-040c46cb1f01\") " pod="openshift-marketplace/marketplace-operator-79b997595-7cfk5" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.954036 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmqcg\" (UniqueName: \"kubernetes.io/projected/f597fef0-1fc5-4826-a130-dd45792b308d-kube-api-access-bmqcg\") pod \"machine-config-controller-84d6567774-92gbq\" (UID: \"f597fef0-1fc5-4826-a130-dd45792b308d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-92gbq" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.954067 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lncns\" (UniqueName: \"kubernetes.io/projected/757f7f5f-b2fc-4fa0-b59b-63ad3548a307-kube-api-access-lncns\") pod \"collect-profiles-29486085-pm58b\" (UID: \"757f7f5f-b2fc-4fa0-b59b-63ad3548a307\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29486085-pm58b" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.954144 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjmgr\" (UniqueName: \"kubernetes.io/projected/b9897961-d77f-441d-9ed1-94a1ea26860e-kube-api-access-fjmgr\") pod \"ingress-canary-scpzq\" (UID: \"b9897961-d77f-441d-9ed1-94a1ea26860e\") " pod="openshift-ingress-canary/ingress-canary-scpzq" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.954181 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e8dec53f-51c9-4e3d-b111-45152d5b0c71-registry-certificates\") pod \"image-registry-697d97f7c8-hgc64\" (UID: \"e8dec53f-51c9-4e3d-b111-45152d5b0c71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hgc64" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.954242 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e8dec53f-51c9-4e3d-b111-45152d5b0c71-trusted-ca\") pod \"image-registry-697d97f7c8-hgc64\" (UID: \"e8dec53f-51c9-4e3d-b111-45152d5b0c71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hgc64" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 
10:53:57.954266 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e8dec53f-51c9-4e3d-b111-45152d5b0c71-bound-sa-token\") pod \"image-registry-697d97f7c8-hgc64\" (UID: \"e8dec53f-51c9-4e3d-b111-45152d5b0c71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hgc64" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.954308 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/04b22862-334a-4deb-b327-10e17c568e99-certs\") pod \"machine-config-server-n7rd5\" (UID: \"04b22862-334a-4deb-b327-10e17c568e99\") " pod="openshift-machine-config-operator/machine-config-server-n7rd5" Jan 23 10:53:57 crc kubenswrapper[4957]: E0123 10:53:57.954735 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-23 10:53:58.454719859 +0000 UTC m=+147.991972636 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hgc64" (UID: "e8dec53f-51c9-4e3d-b111-45152d5b0c71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.955432 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e8dec53f-51c9-4e3d-b111-45152d5b0c71-ca-trust-extracted\") pod \"image-registry-697d97f7c8-hgc64\" (UID: \"e8dec53f-51c9-4e3d-b111-45152d5b0c71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hgc64" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.959090 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/04b86548-903a-4a0b-bb5f-c9cd297a9047-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-xs8l7\" (UID: \"04b86548-903a-4a0b-bb5f-c9cd297a9047\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xs8l7" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.962996 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e8dec53f-51c9-4e3d-b111-45152d5b0c71-installation-pull-secrets\") pod \"image-registry-697d97f7c8-hgc64\" (UID: \"e8dec53f-51c9-4e3d-b111-45152d5b0c71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hgc64" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.965192 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e8dec53f-51c9-4e3d-b111-45152d5b0c71-registry-certificates\") pod \"image-registry-697d97f7c8-hgc64\" (UID: \"e8dec53f-51c9-4e3d-b111-45152d5b0c71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hgc64" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.969182 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/f597fef0-1fc5-4826-a130-dd45792b308d-proxy-tls\") pod \"machine-config-controller-84d6567774-92gbq\" (UID: \"f597fef0-1fc5-4826-a130-dd45792b308d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-92gbq" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.970672 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/96879d14-19e3-4044-a2ea-163a66fe801a-srv-cert\") pod \"olm-operator-6b444d44fb-7dpwm\" (UID: \"96879d14-19e3-4044-a2ea-163a66fe801a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7dpwm" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.972183 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c951bf2d-f3fe-4a75-8e95-040c46cb1f01-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-7cfk5\" (UID: \"c951bf2d-f3fe-4a75-8e95-040c46cb1f01\") " pod="openshift-marketplace/marketplace-operator-79b997595-7cfk5" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.972952 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/96879d14-19e3-4044-a2ea-163a66fe801a-profile-collector-cert\") pod \"olm-operator-6b444d44fb-7dpwm\" (UID: \"96879d14-19e3-4044-a2ea-163a66fe801a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7dpwm" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.974050 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/933ef718-6583-4b78-b8ca-79cfab4926ee-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-g5hzp\" (UID: \"933ef718-6583-4b78-b8ca-79cfab4926ee\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-g5hzp" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.974416 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/757f7f5f-b2fc-4fa0-b59b-63ad3548a307-config-volume\") pod \"collect-profiles-29486085-pm58b\" (UID: \"757f7f5f-b2fc-4fa0-b59b-63ad3548a307\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29486085-pm58b" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.974616 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e8dec53f-51c9-4e3d-b111-45152d5b0c71-registry-tls\") pod \"image-registry-697d97f7c8-hgc64\" (UID: \"e8dec53f-51c9-4e3d-b111-45152d5b0c71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hgc64" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.977814 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c951bf2d-f3fe-4a75-8e95-040c46cb1f01-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-7cfk5\" (UID: \"c951bf2d-f3fe-4a75-8e95-040c46cb1f01\") " pod="openshift-marketplace/marketplace-operator-79b997595-7cfk5" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.979248 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/757f7f5f-b2fc-4fa0-b59b-63ad3548a307-secret-volume\") pod \"collect-profiles-29486085-pm58b\" (UID: \"757f7f5f-b2fc-4fa0-b59b-63ad3548a307\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29486085-pm58b" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.982009 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e8dec53f-51c9-4e3d-b111-45152d5b0c71-trusted-ca\") pod \"image-registry-697d97f7c8-hgc64\" (UID: \"e8dec53f-51c9-4e3d-b111-45152d5b0c71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hgc64" Jan 23 10:53:57 crc kubenswrapper[4957]: I0123 10:53:57.996408 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lbr8\" (UniqueName: \"kubernetes.io/projected/04b86548-903a-4a0b-bb5f-c9cd297a9047-kube-api-access-9lbr8\") pod \"control-plane-machine-set-operator-78cbb6b69f-xs8l7\" (UID: \"04b86548-903a-4a0b-bb5f-c9cd297a9047\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xs8l7" Jan 23 10:53:58 crc kubenswrapper[4957]: I0123 10:53:58.036447 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgcwn\" (UniqueName: \"kubernetes.io/projected/c951bf2d-f3fe-4a75-8e95-040c46cb1f01-kube-api-access-qgcwn\") pod \"marketplace-operator-79b997595-7cfk5\" (UID: \"c951bf2d-f3fe-4a75-8e95-040c46cb1f01\") " pod="openshift-marketplace/marketplace-operator-79b997595-7cfk5" Jan 23 10:53:58 crc kubenswrapper[4957]: I0123 10:53:58.054778 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 23 10:53:58 crc kubenswrapper[4957]: I0123 10:53:58.054961 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8dsn\" (UniqueName: \"kubernetes.io/projected/a03b75a6-72e3-45a2-a6aa-95f65af54796-kube-api-access-x8dsn\") pod \"dns-default-rxncn\" (UID: \"a03b75a6-72e3-45a2-a6aa-95f65af54796\") " pod="openshift-dns/dns-default-rxncn" Jan 23 10:53:58 crc kubenswrapper[4957]: I0123 10:53:58.054999 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nf54g\" (UniqueName: \"kubernetes.io/projected/04b22862-334a-4deb-b327-10e17c568e99-kube-api-access-nf54g\") pod \"machine-config-server-n7rd5\" (UID: \"04b22862-334a-4deb-b327-10e17c568e99\") " pod="openshift-machine-config-operator/machine-config-server-n7rd5" Jan 23 10:53:58 crc kubenswrapper[4957]: I0123 10:53:58.055015 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a03b75a6-72e3-45a2-a6aa-95f65af54796-config-volume\") pod \"dns-default-rxncn\" (UID: \"a03b75a6-72e3-45a2-a6aa-95f65af54796\") " pod="openshift-dns/dns-default-rxncn" Jan 23 10:53:58 crc kubenswrapper[4957]: I0123 10:53:58.055048 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b9897961-d77f-441d-9ed1-94a1ea26860e-cert\") pod \"ingress-canary-scpzq\" (UID: \"b9897961-d77f-441d-9ed1-94a1ea26860e\") " pod="openshift-ingress-canary/ingress-canary-scpzq" Jan 23 10:53:58 crc kubenswrapper[4957]: I0123 10:53:58.055117 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a03b75a6-72e3-45a2-a6aa-95f65af54796-metrics-tls\") pod 
\"dns-default-rxncn\" (UID: \"a03b75a6-72e3-45a2-a6aa-95f65af54796\") " pod="openshift-dns/dns-default-rxncn" Jan 23 10:53:58 crc kubenswrapper[4957]: I0123 10:53:58.055157 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjmgr\" (UniqueName: \"kubernetes.io/projected/b9897961-d77f-441d-9ed1-94a1ea26860e-kube-api-access-fjmgr\") pod \"ingress-canary-scpzq\" (UID: \"b9897961-d77f-441d-9ed1-94a1ea26860e\") " pod="openshift-ingress-canary/ingress-canary-scpzq" Jan 23 10:53:58 crc kubenswrapper[4957]: I0123 10:53:58.055185 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/04b22862-334a-4deb-b327-10e17c568e99-certs\") pod \"machine-config-server-n7rd5\" (UID: \"04b22862-334a-4deb-b327-10e17c568e99\") " pod="openshift-machine-config-operator/machine-config-server-n7rd5" Jan 23 10:53:58 crc kubenswrapper[4957]: I0123 10:53:58.055212 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/04b22862-334a-4deb-b327-10e17c568e99-node-bootstrap-token\") pod \"machine-config-server-n7rd5\" (UID: \"04b22862-334a-4deb-b327-10e17c568e99\") " pod="openshift-machine-config-operator/machine-config-server-n7rd5" Jan 23 10:53:58 crc kubenswrapper[4957]: E0123 10:53:58.055291 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-23 10:53:58.555254614 +0000 UTC m=+148.092507311 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 23 10:53:58 crc kubenswrapper[4957]: I0123 10:53:58.058990 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-frdsx" event={"ID":"b721a986-d921-41f6-ba96-47647e168858","Type":"ContainerStarted","Data":"036e35e6dff969ed9e21fdec8696dc06e87c57363c7d093e80dfac108f68855f"} Jan 23 10:53:58 crc kubenswrapper[4957]: I0123 10:53:58.063579 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-5t9v6" event={"ID":"6bb12583-2035-41bb-8847-fd46198f7ede","Type":"ContainerStarted","Data":"e43de41c4681237fb45ff13b662563f6cca7e5e908ea6724fee3f4e2a6364d8d"} Jan 23 10:53:58 crc kubenswrapper[4957]: I0123 10:53:58.067623 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a03b75a6-72e3-45a2-a6aa-95f65af54796-metrics-tls\") pod \"dns-default-rxncn\" (UID: \"a03b75a6-72e3-45a2-a6aa-95f65af54796\") " pod="openshift-dns/dns-default-rxncn" Jan 23 10:53:58 crc kubenswrapper[4957]: I0123 10:53:58.069883 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b9897961-d77f-441d-9ed1-94a1ea26860e-cert\") pod \"ingress-canary-scpzq\" (UID: \"b9897961-d77f-441d-9ed1-94a1ea26860e\") " pod="openshift-ingress-canary/ingress-canary-scpzq" Jan 23 10:53:58 crc 
kubenswrapper[4957]: I0123 10:53:58.070508 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6r2nq\" (UniqueName: \"kubernetes.io/projected/6e07d3a9-d361-4fb4-a795-0849abadd0f0-kube-api-access-6r2nq\") pod \"migrator-59844c95c7-nnxst\" (UID: \"6e07d3a9-d361-4fb4-a795-0849abadd0f0\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nnxst" Jan 23 10:53:58 crc kubenswrapper[4957]: I0123 10:53:58.070791 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a03b75a6-72e3-45a2-a6aa-95f65af54796-config-volume\") pod \"dns-default-rxncn\" (UID: \"a03b75a6-72e3-45a2-a6aa-95f65af54796\") " pod="openshift-dns/dns-default-rxncn" Jan 23 10:53:58 crc kubenswrapper[4957]: I0123 10:53:58.094037 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-rfzwr"] Jan 23 10:53:58 crc kubenswrapper[4957]: I0123 10:53:58.094894 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/933ef718-6583-4b78-b8ca-79cfab4926ee-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-g5hzp\" (UID: \"933ef718-6583-4b78-b8ca-79cfab4926ee\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-g5hzp" Jan 23 10:53:58 crc kubenswrapper[4957]: I0123 10:53:58.096357 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/04b22862-334a-4deb-b327-10e17c568e99-certs\") pod \"machine-config-server-n7rd5\" (UID: \"04b22862-334a-4deb-b327-10e17c568e99\") " pod="openshift-machine-config-operator/machine-config-server-n7rd5" Jan 23 10:53:58 crc kubenswrapper[4957]: I0123 10:53:58.096573 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nnxst" Jan 23 10:53:58 crc kubenswrapper[4957]: I0123 10:53:58.103481 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/04b22862-334a-4deb-b327-10e17c568e99-node-bootstrap-token\") pod \"machine-config-server-n7rd5\" (UID: \"04b22862-334a-4deb-b327-10e17c568e99\") " pod="openshift-machine-config-operator/machine-config-server-n7rd5" Jan 23 10:53:58 crc kubenswrapper[4957]: I0123 10:53:58.103702 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-d6nt4" event={"ID":"f0f6f781-8789-47c0-badf-5f4a9dc36621","Type":"ContainerStarted","Data":"0ae372a76d6c9aae448685c04ea8f270615e07107b6a8f55b2b4e3e9d05cf84b"} Jan 23 10:53:58 crc kubenswrapper[4957]: I0123 10:53:58.113900 4957 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-7cfk5" Jan 23 10:53:58 crc kubenswrapper[4957]: I0123 10:53:58.123027 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nc9g4\" (UniqueName: \"kubernetes.io/projected/e8dec53f-51c9-4e3d-b111-45152d5b0c71-kube-api-access-nc9g4\") pod \"image-registry-697d97f7c8-hgc64\" (UID: \"e8dec53f-51c9-4e3d-b111-45152d5b0c71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hgc64" Jan 23 10:53:58 crc kubenswrapper[4957]: I0123 10:53:58.125154 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmqcg\" (UniqueName: \"kubernetes.io/projected/f597fef0-1fc5-4826-a130-dd45792b308d-kube-api-access-bmqcg\") pod \"machine-config-controller-84d6567774-92gbq\" (UID: \"f597fef0-1fc5-4826-a130-dd45792b308d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-92gbq" Jan 23 10:53:58 crc kubenswrapper[4957]: I0123 10:53:58.152862 4957 generic.go:334] "Generic (PLEG): container finished" podID="a4d8bbc7-e221-4057-ae8c-c63d33b2e4f5" containerID="00ec9562612ac60c99c90ee357eadb1875d2716a575d0c499dc4c1df4c93b5e4" exitCode=0 Jan 23 10:53:58 crc kubenswrapper[4957]: I0123 10:53:58.152949 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-5vdlt" event={"ID":"a4d8bbc7-e221-4057-ae8c-c63d33b2e4f5","Type":"ContainerDied","Data":"00ec9562612ac60c99c90ee357eadb1875d2716a575d0c499dc4c1df4c93b5e4"} Jan 23 10:53:58 crc kubenswrapper[4957]: I0123 10:53:58.152979 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-5vdlt" event={"ID":"a4d8bbc7-e221-4057-ae8c-c63d33b2e4f5","Type":"ContainerStarted","Data":"0f1748ef625eacf7293da0d47d5fde15c9a45840b2ce4cffaec82bb02c06da76"} Jan 23 10:53:58 crc kubenswrapper[4957]: I0123 10:53:58.153531 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lncns\" (UniqueName: \"kubernetes.io/projected/757f7f5f-b2fc-4fa0-b59b-63ad3548a307-kube-api-access-lncns\") pod \"collect-profiles-29486085-pm58b\" (UID: \"757f7f5f-b2fc-4fa0-b59b-63ad3548a307\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29486085-pm58b" Jan 23 10:53:58 crc kubenswrapper[4957]: I0123 10:53:58.156786 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hgc64\" (UID: \"e8dec53f-51c9-4e3d-b111-45152d5b0c71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hgc64" Jan 23 10:53:58 crc kubenswrapper[4957]: E0123 10:53:58.157524 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-23 10:53:58.657511174 +0000 UTC m=+148.194763861 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hgc64" (UID: "e8dec53f-51c9-4e3d-b111-45152d5b0c71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 23 10:53:58 crc kubenswrapper[4957]: I0123 10:53:58.165225 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnnwv\" (UniqueName: \"kubernetes.io/projected/96879d14-19e3-4044-a2ea-163a66fe801a-kube-api-access-nnnwv\") pod \"olm-operator-6b444d44fb-7dpwm\" (UID: \"96879d14-19e3-4044-a2ea-163a66fe801a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7dpwm" Jan 23 10:53:58 crc kubenswrapper[4957]: I0123 10:53:58.169819 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-92gbq" Jan 23 10:53:58 crc kubenswrapper[4957]: I0123 10:53:58.178297 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-d94z6" event={"ID":"92ace232-9107-480d-a4cd-27d7bd114efd","Type":"ContainerStarted","Data":"f74f8069ace0111ed293572788bd4ed92a7b922153cad64d73261f8e48740619"} Jan 23 10:53:58 crc kubenswrapper[4957]: I0123 10:53:58.178341 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-d94z6" event={"ID":"92ace232-9107-480d-a4cd-27d7bd114efd","Type":"ContainerStarted","Data":"a10ef2a6f762abe037b65b5f169be80640196725753b9b455cab05084022b269"} Jan 23 10:53:58 crc kubenswrapper[4957]: I0123 10:53:58.179031 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-d94z6" Jan 23 10:53:58 crc kubenswrapper[4957]: I0123 10:53:58.179372 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e8dec53f-51c9-4e3d-b111-45152d5b0c71-bound-sa-token\") pod \"image-registry-697d97f7c8-hgc64\" (UID: \"e8dec53f-51c9-4e3d-b111-45152d5b0c71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hgc64" Jan 23 10:53:58 crc kubenswrapper[4957]: I0123 10:53:58.183508 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lt6lk" event={"ID":"d10230a4-235f-4c7d-9057-cc613fab04fc","Type":"ContainerStarted","Data":"56f91b9dc45aa9c107857a000f5f67fef0a4cca711223caadb77ace4b2dbde68"} Jan 23 10:53:58 crc kubenswrapper[4957]: I0123 10:53:58.186217 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"f5c3e9a39f90ed69ede4e6222a4ada00b3daaf242ac8f58bc97aa707dbf92ed9"} Jan 23 10:53:58 crc kubenswrapper[4957]: I0123 10:53:58.186247 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"963a5f4b797fe757c84d375684b206739b01c528e6f9aca2baa5541d6ca1a347"} Jan 23 10:53:58 crc kubenswrapper[4957]: I0123 10:53:58.188394 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-z8lhm"] Jan 23 
10:53:58 crc kubenswrapper[4957]: I0123 10:53:58.193563 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7dpwm" Jan 23 10:53:58 crc kubenswrapper[4957]: I0123 10:53:58.194557 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8dsn\" (UniqueName: \"kubernetes.io/projected/a03b75a6-72e3-45a2-a6aa-95f65af54796-kube-api-access-x8dsn\") pod \"dns-default-rxncn\" (UID: \"a03b75a6-72e3-45a2-a6aa-95f65af54796\") " pod="openshift-dns/dns-default-rxncn" Jan 23 10:53:58 crc kubenswrapper[4957]: I0123 10:53:58.196218 4957 patch_prober.go:28] interesting pod/downloads-7954f5f757-d94z6 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body= Jan 23 10:53:58 crc kubenswrapper[4957]: I0123 10:53:58.196246 4957 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-d94z6" podUID="92ace232-9107-480d-a4cd-27d7bd114efd" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" Jan 23 10:53:58 crc kubenswrapper[4957]: I0123 10:53:58.196570 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xs8l7" Jan 23 10:53:58 crc kubenswrapper[4957]: I0123 10:53:58.198561 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"6b5743b4ebe6141544bd6f6d1a9ce8fb7cf5f7a794615418e50b2ed9b314fd71"} Jan 23 10:53:58 crc kubenswrapper[4957]: I0123 10:53:58.198610 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"db7f9dd2b9f802d433f0053bb71af4bbd6d8623f4f67fc6a374663861563ccaf"} Jan 23 10:53:58 crc kubenswrapper[4957]: I0123 10:53:58.202652 4957 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29486085-pm58b" Jan 23 10:53:58 crc kubenswrapper[4957]: I0123 10:53:58.212409 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fh2f4" event={"ID":"f918b8f5-e2ac-439f-b9b6-8a2e85dd4ba3","Type":"ContainerStarted","Data":"6a688a6934f0bb185dbfc3cb264e59889daba0c9aca129a2adde95ecdca6dd35"} Jan 23 10:53:58 crc kubenswrapper[4957]: I0123 10:53:58.214196 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7t6gz" event={"ID":"7183c961-6fa6-49c8-909a-7defa10a655a","Type":"ContainerStarted","Data":"e6564ab9375a32322d55f97c67821edd80dcd814743f1f4eab06f21a44d318a8"} Jan 23 10:53:58 crc kubenswrapper[4957]: I0123 10:53:58.219619 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjmgr\" (UniqueName: \"kubernetes.io/projected/b9897961-d77f-441d-9ed1-94a1ea26860e-kube-api-access-fjmgr\") pod \"ingress-canary-scpzq\" (UID: \"b9897961-d77f-441d-9ed1-94a1ea26860e\") " pod="openshift-ingress-canary/ingress-canary-scpzq" Jan 23 10:53:58 crc kubenswrapper[4957]: I0123 10:53:58.226000 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-jh66z" event={"ID":"214c7e14-5663-41c3-8a75-4573dff48b63","Type":"ContainerStarted","Data":"7bd6bf516ea7321e3da0082a7467a4584bd36864bef51dc3551f360c88546912"} Jan 23 10:53:58 crc kubenswrapper[4957]: W0123 10:53:58.226796 4957 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod808262b6_32ca_42f1_938e_f009dac6b1db.slice/crio-2d6908bd1d1b732427e872106217a16036e3ebb6c922f3751755edf926dab231 WatchSource:0}: Error finding container 2d6908bd1d1b732427e872106217a16036e3ebb6c922f3751755edf926dab231: Status 404 returned error can't find the container with id 2d6908bd1d1b732427e872106217a16036e3ebb6c922f3751755edf926dab231 Jan 23 10:53:58 crc kubenswrapper[4957]: I0123 10:53:58.236925 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7cppv" event={"ID":"c210711d-5840-4b58-948c-e9d19f041ee2","Type":"ContainerStarted","Data":"69650acde8b12b422057b5102678d1d1922e1c00f50875fe9285f1601872a63c"} Jan 23 10:53:58 crc kubenswrapper[4957]: I0123 10:53:58.238300 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nf54g\" (UniqueName: \"kubernetes.io/projected/04b22862-334a-4deb-b327-10e17c568e99-kube-api-access-nf54g\") pod \"machine-config-server-n7rd5\" (UID: \"04b22862-334a-4deb-b327-10e17c568e99\") " pod="openshift-machine-config-operator/machine-config-server-n7rd5" Jan 23 10:53:58 crc kubenswrapper[4957]: I0123 10:53:58.251899 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-2z4v2" event={"ID":"da48191c-513e-4069-b4b7-8f6de3363326","Type":"ContainerStarted","Data":"3c4e55e3794d397a6e111616ec8a99a7c228d264075cfde4b9da7e2bcbd0a5ed"} Jan 23 10:53:58 crc kubenswrapper[4957]: I0123 10:53:58.257905 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 23 10:53:58 crc kubenswrapper[4957]: E0123 10:53:58.258972 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-23 10:53:58.758954484 +0000 UTC m=+148.296207171 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 23 10:53:58 crc kubenswrapper[4957]: I0123 10:53:58.262667 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z6lpq"] Jan 23 10:53:58 crc kubenswrapper[4957]: I0123 10:53:58.264340 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-n7rd5" Jan 23 10:53:58 crc kubenswrapper[4957]: I0123 10:53:58.268222 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"3a266a74e459062f57e2a075a33532c7f7d8f2a639b64cf17a2361add928d809"} Jan 23 10:53:58 crc kubenswrapper[4957]: I0123 10:53:58.268975 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 23 10:53:58 crc kubenswrapper[4957]: I0123 10:53:58.272820 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-lrnc4" event={"ID":"4238c53f-acdc-409f-8d1d-e4608fe5c239","Type":"ContainerStarted","Data":"022ba5077af66b884bec95ccd63b568338e452e55c01395c872fa099bfe2bd1e"} Jan 23 10:53:58 crc kubenswrapper[4957]: I0123 10:53:58.272852 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-lrnc4" event={"ID":"4238c53f-acdc-409f-8d1d-e4608fe5c239","Type":"ContainerStarted","Data":"5e2917f2d395a6c739cbaca092e3d10b8e379530c29ed8d896098f2c178122f8"} Jan 23 10:53:58 crc kubenswrapper[4957]: I0123 10:53:58.275684 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-cqcmp" event={"ID":"ea6adf64-ce45-4965-8c43-0f0215c18109","Type":"ContainerStarted","Data":"40448eee897d34ffcafc921680edc9433eade05f4f35601a9217be7e41f2d194"} Jan 23 10:53:58 crc kubenswrapper[4957]: I0123 10:53:58.276833 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-cqcmp" Jan 23 10:53:58 crc kubenswrapper[4957]: I0123 10:53:58.278802 4957 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-cqcmp container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Jan 23 10:53:58 crc kubenswrapper[4957]: I0123 10:53:58.278846 4957 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-controller-manager/controller-manager-879f6c89f-cqcmp" podUID="ea6adf64-ce45-4965-8c43-0f0215c18109" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" Jan 23 10:53:58 crc kubenswrapper[4957]: I0123 10:53:58.290788 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-z7qr7" event={"ID":"f6bd9bee-ee36-4280-948c-f39f88acaa70","Type":"ContainerStarted","Data":"2e0f477e169be6359210eeffdf587282cce727f254e63542448a9708973adcee"} Jan 23 10:53:58 crc kubenswrapper[4957]: I0123 10:53:58.291627 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-z7qr7" Jan 23 10:53:58 crc kubenswrapper[4957]: I0123 10:53:58.296547 4957 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-z7qr7 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.16:8443/healthz\": dial tcp 10.217.0.16:8443: connect: connection refused" start-of-body= Jan 23 10:53:58 crc kubenswrapper[4957]: I0123 10:53:58.296599 4957 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-z7qr7" podUID="f6bd9bee-ee36-4280-948c-f39f88acaa70" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.16:8443/healthz\": dial tcp 10.217.0.16:8443: connect: connection refused" Jan 23 10:53:58 crc kubenswrapper[4957]: I0123 10:53:58.297246 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-scpzq" Jan 23 10:53:58 crc kubenswrapper[4957]: I0123 10:53:58.310513 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-rxncn" Jan 23 10:53:58 crc kubenswrapper[4957]: I0123 10:53:58.312744 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-wdgd4"] Jan 23 10:53:58 crc kubenswrapper[4957]: I0123 10:53:58.358871 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hgc64\" (UID: \"e8dec53f-51c9-4e3d-b111-45152d5b0c71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hgc64" Jan 23 10:53:58 crc kubenswrapper[4957]: E0123 10:53:58.359891 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-23 10:53:58.859873309 +0000 UTC m=+148.397125996 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hgc64" (UID: "e8dec53f-51c9-4e3d-b111-45152d5b0c71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 23 10:53:58 crc kubenswrapper[4957]: I0123 10:53:58.360545 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-g5hzp" Jan 23 10:53:58 crc kubenswrapper[4957]: W0123 10:53:58.399586 4957 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbeac68ed_ac4c_424e_aaff_79a4f7f246d0.slice/crio-e4237de94c01f2791abbf5ed915ceceea2cd9db33667b5882df0c05ebcca97ec WatchSource:0}: Error finding container e4237de94c01f2791abbf5ed915ceceea2cd9db33667b5882df0c05ebcca97ec: Status 404 returned error can't find the container with id e4237de94c01f2791abbf5ed915ceceea2cd9db33667b5882df0c05ebcca97ec Jan 23 10:53:58 crc kubenswrapper[4957]: I0123 10:53:58.464386 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 23 10:53:58 crc kubenswrapper[4957]: E0123 10:53:58.466478 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-23 10:53:58.966445841 +0000 UTC m=+148.503698528 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 23 10:53:58 crc kubenswrapper[4957]: I0123 10:53:58.468006 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hgc64\" (UID: \"e8dec53f-51c9-4e3d-b111-45152d5b0c71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hgc64" Jan 23 10:53:58 crc kubenswrapper[4957]: E0123 10:53:58.468337 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-23 10:53:58.96832896 +0000 UTC m=+148.505581647 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hgc64" (UID: "e8dec53f-51c9-4e3d-b111-45152d5b0c71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 23 10:53:58 crc kubenswrapper[4957]: I0123 10:53:58.569686 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 23 10:53:58 crc kubenswrapper[4957]: E0123 10:53:58.570570 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-23 10:53:59.070551369 +0000 UTC m=+148.607804056 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 23 10:53:58 crc kubenswrapper[4957]: I0123 10:53:58.671224 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hgc64\" (UID: \"e8dec53f-51c9-4e3d-b111-45152d5b0c71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hgc64" Jan 23 10:53:58 crc kubenswrapper[4957]: E0123 10:53:58.671731 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-23 10:53:59.171719911 +0000 UTC m=+148.708972598 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hgc64" (UID: "e8dec53f-51c9-4e3d-b111-45152d5b0c71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 23 10:53:58 crc kubenswrapper[4957]: I0123 10:53:58.728621 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-2zqvt"] Jan 23 10:53:58 crc kubenswrapper[4957]: I0123 10:53:58.729930 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-bcjxk"] Jan 23 10:53:58 crc kubenswrapper[4957]: I0123 10:53:58.772727 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 23 10:53:58 crc kubenswrapper[4957]: E0123 10:53:58.774309 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-23 10:53:59.274265839 +0000 UTC m=+148.811518516 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 23 10:53:58 crc kubenswrapper[4957]: I0123 10:53:58.789361 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-grskq"] Jan 23 10:53:58 crc kubenswrapper[4957]: I0123 10:53:58.875198 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hgc64\" (UID: \"e8dec53f-51c9-4e3d-b111-45152d5b0c71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hgc64" Jan 23 10:53:58 crc kubenswrapper[4957]: E0123 10:53:58.875842 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-23 10:53:59.375831511 +0000 UTC m=+148.913084198 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hgc64" (UID: "e8dec53f-51c9-4e3d-b111-45152d5b0c71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 23 10:53:58 crc kubenswrapper[4957]: I0123 10:53:58.977384 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 23 10:53:58 crc kubenswrapper[4957]: E0123 10:53:58.977680 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-23 10:53:59.4776648 +0000 UTC m=+149.014917487 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 23 10:53:59 crc kubenswrapper[4957]: I0123 10:53:59.084926 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hgc64\" (UID: \"e8dec53f-51c9-4e3d-b111-45152d5b0c71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hgc64" Jan 23 10:53:59 crc kubenswrapper[4957]: E0123 10:53:59.085227 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-23 10:53:59.585217198 +0000 UTC m=+149.122469885 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hgc64" (UID: "e8dec53f-51c9-4e3d-b111-45152d5b0c71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 23 10:53:59 crc kubenswrapper[4957]: I0123 10:53:59.117345 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-tnjn2"] Jan 23 10:53:59 crc kubenswrapper[4957]: I0123 10:53:59.137116 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h2fvd"] Jan 23 10:53:59 crc kubenswrapper[4957]: I0123 10:53:59.187023 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jp7cg"] Jan 23 10:53:59 crc kubenswrapper[4957]: I0123 10:53:59.187393 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 23 10:53:59 crc kubenswrapper[4957]: E0123 10:53:59.187480 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-23 10:53:59.687457748 +0000 UTC m=+149.224710435 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 23 10:53:59 crc kubenswrapper[4957]: I0123 10:53:59.188341 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hgc64\" (UID: \"e8dec53f-51c9-4e3d-b111-45152d5b0c71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hgc64" Jan 23 10:53:59 crc kubenswrapper[4957]: E0123 10:53:59.188801 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-23 10:53:59.688787633 +0000 UTC m=+149.226040320 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hgc64" (UID: "e8dec53f-51c9-4e3d-b111-45152d5b0c71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 23 10:53:59 crc kubenswrapper[4957]: I0123 10:53:59.289294 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 23 10:53:59 crc kubenswrapper[4957]: E0123 10:53:59.289972 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-23 10:53:59.789950124 +0000 UTC m=+149.327202811 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 23 10:53:59 crc kubenswrapper[4957]: I0123 10:53:59.298514 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-x94pd"] Jan 23 10:53:59 crc kubenswrapper[4957]: I0123 10:53:59.352166 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-8p9zs"] Jan 23 10:53:59 crc kubenswrapper[4957]: I0123 10:53:59.373421 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kvtws"] Jan 23 10:53:59 crc kubenswrapper[4957]: I0123 10:53:59.386576 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-wt5d2"] Jan 23 10:53:59 crc kubenswrapper[4957]: I0123 10:53:59.391585 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hgc64\" (UID: \"e8dec53f-51c9-4e3d-b111-45152d5b0c71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hgc64" Jan 23 10:53:59 crc kubenswrapper[4957]: E0123 10:53:59.392760 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-23 10:53:59.892738248 +0000 UTC m=+149.429990935 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hgc64" (UID: "e8dec53f-51c9-4e3d-b111-45152d5b0c71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 23 10:53:59 crc kubenswrapper[4957]: I0123 10:53:59.463587 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-6brn9" event={"ID":"beac68ed-ac4c-424e-aaff-79a4f7f246d0","Type":"ContainerStarted","Data":"f0721a30c389e16daff490ac3bc3326b045b5c764a79697e5266684917ed1502"} Jan 23 10:53:59 crc kubenswrapper[4957]: I0123 10:53:59.463649 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-6brn9" event={"ID":"beac68ed-ac4c-424e-aaff-79a4f7f246d0","Type":"ContainerStarted","Data":"e4237de94c01f2791abbf5ed915ceceea2cd9db33667b5882df0c05ebcca97ec"} Jan 23 10:53:59 crc kubenswrapper[4957]: I0123 10:53:59.472238 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-z8lhm" event={"ID":"d140b4dc-6d8e-4940-9a60-aa98665ac1b2","Type":"ContainerStarted","Data":"b920da434ccbbb635d322efb9d8d84b1c8f4ac4ecdd37b8bc3834f91ea2a2334"} Jan 23 10:53:59 crc kubenswrapper[4957]: W0123 10:53:59.475939 4957 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc810606c_cfa3_4391_bbc2_7e6ff647393c.slice/crio-2e82b21c96d6c9564c601e9f6ed89a51c76bdbf02b7f7ab696678b16085372b3 WatchSource:0}: Error finding container 2e82b21c96d6c9564c601e9f6ed89a51c76bdbf02b7f7ab696678b16085372b3: Status 404 returned error can't find the container with id 2e82b21c96d6c9564c601e9f6ed89a51c76bdbf02b7f7ab696678b16085372b3 Jan 23 10:53:59 crc kubenswrapper[4957]: I0123 10:53:59.486311 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-2zqvt" event={"ID":"5b14648d-48f6-42d6-8bd3-91ddcc54bfbc","Type":"ContainerStarted","Data":"92055c28675eb8b5d2bef46f2553c0c2b0e0d9d295bf99071028133dc3ffd28c"} Jan 23 10:53:59 crc kubenswrapper[4957]: I0123 10:53:59.491821 4957 generic.go:334] "Generic (PLEG): container finished" podID="f918b8f5-e2ac-439f-b9b6-8a2e85dd4ba3" containerID="11e3bcbd917c48f04b1e5e70269bf7773486ecbff7acb818f599b9874b3a43de" exitCode=0 Jan 23 10:53:59 crc kubenswrapper[4957]: I0123 10:53:59.491936 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fh2f4" event={"ID":"f918b8f5-e2ac-439f-b9b6-8a2e85dd4ba3","Type":"ContainerDied","Data":"11e3bcbd917c48f04b1e5e70269bf7773486ecbff7acb818f599b9874b3a43de"} Jan 23 10:53:59 crc kubenswrapper[4957]: I0123 10:53:59.493036 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 23 10:53:59 crc kubenswrapper[4957]: E0123 10:53:59.493475 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b 
nodeName:}" failed. No retries permitted until 2026-01-23 10:53:59.993450458 +0000 UTC m=+149.530703145 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 23 10:53:59 crc kubenswrapper[4957]: I0123 10:53:59.493541 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hgc64\" (UID: \"e8dec53f-51c9-4e3d-b111-45152d5b0c71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hgc64" Jan 23 10:53:59 crc kubenswrapper[4957]: E0123 10:53:59.494056 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-23 10:53:59.994011973 +0000 UTC m=+149.531264670 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hgc64" (UID: "e8dec53f-51c9-4e3d-b111-45152d5b0c71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 23 10:53:59 crc kubenswrapper[4957]: I0123 10:53:59.598255 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 23 10:53:59 crc kubenswrapper[4957]: E0123 10:53:59.598504 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-23 10:54:00.09846646 +0000 UTC m=+149.635719147 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 23 10:53:59 crc kubenswrapper[4957]: I0123 10:53:59.599076 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hgc64\" (UID: \"e8dec53f-51c9-4e3d-b111-45152d5b0c71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hgc64" Jan 23 10:53:59 crc kubenswrapper[4957]: E0123 10:53:59.599730 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-23 10:54:00.099709522 +0000 UTC m=+149.636962209 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hgc64" (UID: "e8dec53f-51c9-4e3d-b111-45152d5b0c71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 23 10:53:59 crc kubenswrapper[4957]: I0123 10:53:59.679379 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-jh66z" event={"ID":"214c7e14-5663-41c3-8a75-4573dff48b63","Type":"ContainerStarted","Data":"2464daf1269ce8842fae51d7115a1b9723154d007ac1c92bc69c01217259ed2b"} Jan 23 10:53:59 crc kubenswrapper[4957]: I0123 10:53:59.701633 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 23 10:53:59 crc kubenswrapper[4957]: E0123 10:53:59.702580 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-23 10:54:00.202561428 +0000 UTC m=+149.739814125 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 23 10:53:59 crc kubenswrapper[4957]: I0123 10:53:59.708934 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-wtmvm" event={"ID":"dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d","Type":"ContainerStarted","Data":"0aa13813f27e6f34a353f611b50a2b5b75dbd5c175051469e0ec01cea287a791"} Jan 23 10:53:59 crc kubenswrapper[4957]: I0123 10:53:59.724566 4957 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-cqcmp" podStartSLOduration=129.724535039 podStartE2EDuration="2m9.724535039s" podCreationTimestamp="2026-01-23 10:51:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-23 10:53:59.694223941 +0000 UTC m=+149.231476648" watchObservedRunningTime="2026-01-23 10:53:59.724535039 +0000 UTC m=+149.261787726" Jan 23 10:53:59 crc kubenswrapper[4957]: I0123 10:53:59.726610 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xs8l7"] Jan 23 10:53:59 crc kubenswrapper[4957]: I0123 10:53:59.748123 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-bcjxk" event={"ID":"9af57495-ed2b-4a03-8206-b26948dfa61a","Type":"ContainerStarted","Data":"ea2de05779305a6165122f5ebf859f95b7ed7b4d5bcb43f55ccf0619ec0f037a"} Jan 23 10:53:59 crc kubenswrapper[4957]: I0123 10:53:59.771889 4957 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7cppv" podStartSLOduration=130.77186041 podStartE2EDuration="2m10.77186041s" podCreationTimestamp="2026-01-23 10:51:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-23 10:53:59.748584945 +0000 UTC m=+149.285837632" watchObservedRunningTime="2026-01-23 10:53:59.77186041 +0000 UTC m=+149.309113097" Jan 23 10:53:59 crc kubenswrapper[4957]: I0123 10:53:59.772846 4957 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-6brn9" Jan 23 10:53:59 crc kubenswrapper[4957]: I0123 10:53:59.800529 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fkz4f" event={"ID":"5b657ce2-3516-4ac1-9bdb-a6fc97c19b31","Type":"ContainerStarted","Data":"6973811f0887e2691584d94ac88f0e5e97c27c8cbebc9e058757e7f401e23df0"} Jan 23 10:53:59 crc kubenswrapper[4957]: I0123 10:53:59.811970 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hgc64\" (UID: \"e8dec53f-51c9-4e3d-b111-45152d5b0c71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hgc64" 
Jan 23 10:53:59 crc kubenswrapper[4957]: E0123 10:53:59.845217 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-23 10:54:00.345197028 +0000 UTC m=+149.882449715 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hgc64" (UID: "e8dec53f-51c9-4e3d-b111-45152d5b0c71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 23 10:53:59 crc kubenswrapper[4957]: I0123 10:53:59.846929 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-n7rd5" event={"ID":"04b22862-334a-4deb-b327-10e17c568e99","Type":"ContainerStarted","Data":"83ae45c5a009d75ade8766874e7ad05cbad7814c2342a482170d305d70a1f6e6"} Jan 23 10:53:59 crc kubenswrapper[4957]: I0123 10:53:59.858222 4957 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-6brn9" podStartSLOduration=129.858110315 podStartE2EDuration="2m9.858110315s" podCreationTimestamp="2026-01-23 10:51:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-23 10:53:59.845842205 +0000 UTC m=+149.383094892" watchObservedRunningTime="2026-01-23 10:53:59.858110315 +0000 UTC m=+149.395363002" Jan 23 10:53:59 crc kubenswrapper[4957]: I0123 10:53:59.858332 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-rfzwr" event={"ID":"808262b6-32ca-42f1-938e-f009dac6b1db","Type":"ContainerStarted","Data":"2d6908bd1d1b732427e872106217a16036e3ebb6c922f3751755edf926dab231"} Jan 23 10:53:59 crc kubenswrapper[4957]: I0123 10:53:59.868237 4957 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-d94z6" podStartSLOduration=130.868220307 podStartE2EDuration="2m10.868220307s" podCreationTimestamp="2026-01-23 10:51:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-23 10:53:59.867474978 +0000 UTC m=+149.404727665" watchObservedRunningTime="2026-01-23 10:53:59.868220307 +0000 UTC m=+149.405472994" Jan 23 10:53:59 crc kubenswrapper[4957]: I0123 10:53:59.909696 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z6lpq" event={"ID":"410302c2-c1e9-4b31-8edb-a62078470e7f","Type":"ContainerStarted","Data":"1e40cd0286cdc4bc3df55253700cc402fdf4eef6ebefbdcf2a907d3189ed0295"} Jan 23 10:53:59 crc kubenswrapper[4957]: I0123 10:53:59.910070 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z6lpq" Jan 23 10:53:59 crc kubenswrapper[4957]: I0123 10:53:59.921617 4957 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-z6lpq container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.42:5443/healthz\": dial tcp 10.217.0.42:5443: connect: connection refused" start-of-body= 
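The readiness and startup probe failures interleaved with these events (connection refused for controller-manager, route-controller-manager, packageserver and the downloads pod; HTTP 500 from the router's healthz) follow the usual pattern of containers that have just started but are not yet serving traffic, and they clear once the endpoints come up. Below is a minimal Go sketch of an HTTP probe of this shape; it is illustrative only, the probe helper and the 127.0.0.1 URL are invented for the example, and the kubelet's real prober additionally handles HTTPS, configured headers, and per-probe timeouts from the pod spec.

// Illustrative sketch only: one HTTP GET classified the way the kubelet prober
// reports it. A transport error such as "connect: connection refused" means the
// container is not listening yet; a non-2xx/3xx status (e.g. the router's 500
// from /healthz) is reported as "HTTP probe failed with statuscode: <code>".
package main

import (
	"fmt"
	"io"
	"net/http"
	"time"
)

func probe(url string) error {
	client := &http.Client{Timeout: 1 * time.Second}
	resp, err := client.Get(url)
	if err != nil {
		// e.g. Get "http://...": dial tcp ...: connect: connection refused
		return fmt.Errorf("Get %q: %v", url, err)
	}
	defer resp.Body.Close()
	body, _ := io.ReadAll(io.LimitReader(resp.Body, 1024))
	if resp.StatusCode < 200 || resp.StatusCode >= 400 {
		return fmt.Errorf("HTTP probe failed with statuscode: %d, start-of-body=%s", resp.StatusCode, body)
	}
	return nil
}

func main() {
	// Hypothetical endpoint; in the log these are pod IPs such as
	// https://10.217.0.6:8443/healthz and http://10.217.0.10:8080/.
	if err := probe("http://127.0.0.1:8080/healthz"); err != nil {
		fmt.Println("Probe failed:", err)
		return
	}
	fmt.Println("probe succeeded")
}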
Jan 23 10:53:59 crc kubenswrapper[4957]: I0123 10:53:59.921694 4957 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z6lpq" podUID="410302c2-c1e9-4b31-8edb-a62078470e7f" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.42:5443/healthz\": dial tcp 10.217.0.42:5443: connect: connection refused" Jan 23 10:53:59 crc kubenswrapper[4957]: I0123 10:53:59.945795 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 23 10:53:59 crc kubenswrapper[4957]: E0123 10:53:59.946258 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-23 10:54:00.446239967 +0000 UTC m=+149.983492654 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 23 10:53:59 crc kubenswrapper[4957]: I0123 10:53:59.963161 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"90cccd7723aed701e41c51422e458e8d13b0c4cf5d76710ed6b4ddb9fbab00d3"} Jan 23 10:53:59 crc kubenswrapper[4957]: I0123 10:53:59.969723 4957 patch_prober.go:28] interesting pod/router-default-5444994796-6brn9 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 23 10:53:59 crc kubenswrapper[4957]: [-]has-synced failed: reason withheld Jan 23 10:53:59 crc kubenswrapper[4957]: [+]process-running ok Jan 23 10:53:59 crc kubenswrapper[4957]: healthz check failed Jan 23 10:53:59 crc kubenswrapper[4957]: I0123 10:53:59.969776 4957 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6brn9" podUID="beac68ed-ac4c-424e-aaff-79a4f7f246d0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 23 10:53:59 crc kubenswrapper[4957]: I0123 10:53:59.972788 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-grskq" event={"ID":"9f5fe703-fd89-4007-ae49-96ae0202d69c","Type":"ContainerStarted","Data":"33f5a0c3d4fc779cdfebd45024acff1e5c6b3084c851346da7bdcef781cd9586"} Jan 23 10:53:59 crc kubenswrapper[4957]: I0123 10:53:59.991840 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-wdgd4" event={"ID":"6e16c2a0-8698-450c-b5d9-78ab6c26a2b3","Type":"ContainerStarted","Data":"435b3f695a246a5ae1a3e84d58d5dff1d8252b32b99ccef39b7a10a294ed8e3c"} Jan 23 10:54:00 crc 
kubenswrapper[4957]: I0123 10:54:00.009475 4957 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-z7qr7" podStartSLOduration=130.009456402 podStartE2EDuration="2m10.009456402s" podCreationTimestamp="2026-01-23 10:51:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-23 10:54:00.008368593 +0000 UTC m=+149.545621290" watchObservedRunningTime="2026-01-23 10:54:00.009456402 +0000 UTC m=+149.546709089" Jan 23 10:54:00 crc kubenswrapper[4957]: I0123 10:54:00.017082 4957 generic.go:334] "Generic (PLEG): container finished" podID="6bb12583-2035-41bb-8847-fd46198f7ede" containerID="3979854b24317f573eec3ee7d537cec75ae12c7dc9e8cf5cb7d048c871b636e9" exitCode=0 Jan 23 10:54:00 crc kubenswrapper[4957]: I0123 10:54:00.017226 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-5t9v6" event={"ID":"6bb12583-2035-41bb-8847-fd46198f7ede","Type":"ContainerDied","Data":"3979854b24317f573eec3ee7d537cec75ae12c7dc9e8cf5cb7d048c871b636e9"} Jan 23 10:54:00 crc kubenswrapper[4957]: I0123 10:54:00.036798 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-frdsx" event={"ID":"b721a986-d921-41f6-ba96-47647e168858","Type":"ContainerStarted","Data":"22d2df05bf87d53d74c9c81a4cac255000d5042d355b53b63ced7e18573d2702"} Jan 23 10:54:00 crc kubenswrapper[4957]: I0123 10:54:00.046991 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hgc64\" (UID: \"e8dec53f-51c9-4e3d-b111-45152d5b0c71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hgc64" Jan 23 10:54:00 crc kubenswrapper[4957]: E0123 10:54:00.048444 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-23 10:54:00.548431496 +0000 UTC m=+150.085684183 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hgc64" (UID: "e8dec53f-51c9-4e3d-b111-45152d5b0c71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 23 10:54:00 crc kubenswrapper[4957]: I0123 10:54:00.061053 4957 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z6lpq" podStartSLOduration=130.061029993 podStartE2EDuration="2m10.061029993s" podCreationTimestamp="2026-01-23 10:51:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-23 10:54:00.054604566 +0000 UTC m=+149.591857253" watchObservedRunningTime="2026-01-23 10:54:00.061029993 +0000 UTC m=+149.598282680" Jan 23 10:54:00 crc kubenswrapper[4957]: I0123 10:54:00.139655 4957 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-frdsx" podStartSLOduration=131.139632968 podStartE2EDuration="2m11.139632968s" podCreationTimestamp="2026-01-23 10:51:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-23 10:54:00.084173935 +0000 UTC m=+149.621426622" watchObservedRunningTime="2026-01-23 10:54:00.139632968 +0000 UTC m=+149.676885655" Jan 23 10:54:00 crc kubenswrapper[4957]: I0123 10:54:00.144569 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7cppv" event={"ID":"c210711d-5840-4b58-948c-e9d19f041ee2","Type":"ContainerStarted","Data":"5d86254afee20a51fbba6a2d7f7638b1c9a82359c98a18178ad4bf622158164d"} Jan 23 10:54:00 crc kubenswrapper[4957]: I0123 10:54:00.148139 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 23 10:54:00 crc kubenswrapper[4957]: E0123 10:54:00.152479 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-23 10:54:00.652435861 +0000 UTC m=+150.189688558 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 23 10:54:00 crc kubenswrapper[4957]: I0123 10:54:00.162925 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lt6lk" event={"ID":"d10230a4-235f-4c7d-9057-cc613fab04fc","Type":"ContainerStarted","Data":"080c2b44fd0d87061ef30f28781232bf4fbd640e5e158ad66bc4ac86e3591d84"} Jan 23 10:54:00 crc kubenswrapper[4957]: I0123 10:54:00.187941 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-z7qr7" event={"ID":"f6bd9bee-ee36-4280-948c-f39f88acaa70","Type":"ContainerStarted","Data":"372478e4e90cdb2660fa7b912f5f3461e60b2ae637861911896bf3ef647f70e7"} Jan 23 10:54:00 crc kubenswrapper[4957]: I0123 10:54:00.220161 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-z7qr7" Jan 23 10:54:00 crc kubenswrapper[4957]: I0123 10:54:00.229830 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-cqcmp" event={"ID":"ea6adf64-ce45-4965-8c43-0f0215c18109","Type":"ContainerStarted","Data":"47d214baa4f3aea0bf8b97cf2de29bf328345dc34ec9b06c97867d6d4108ea34"} Jan 23 10:54:00 crc kubenswrapper[4957]: I0123 10:54:00.242214 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-cqcmp" Jan 23 10:54:00 crc kubenswrapper[4957]: I0123 10:54:00.255024 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hgc64\" (UID: \"e8dec53f-51c9-4e3d-b111-45152d5b0c71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hgc64" Jan 23 10:54:00 crc kubenswrapper[4957]: E0123 10:54:00.255447 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-23 10:54:00.75543411 +0000 UTC m=+150.292686797 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hgc64" (UID: "e8dec53f-51c9-4e3d-b111-45152d5b0c71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 23 10:54:00 crc kubenswrapper[4957]: I0123 10:54:00.284877 4957 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lt6lk" podStartSLOduration=131.284856715 podStartE2EDuration="2m11.284856715s" podCreationTimestamp="2026-01-23 10:51:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-23 10:54:00.193478658 +0000 UTC m=+149.730731345" watchObservedRunningTime="2026-01-23 10:54:00.284856715 +0000 UTC m=+149.822109402" Jan 23 10:54:00 crc kubenswrapper[4957]: I0123 10:54:00.288925 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-scpzq"] Jan 23 10:54:00 crc kubenswrapper[4957]: I0123 10:54:00.302696 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-nnxst"] Jan 23 10:54:00 crc kubenswrapper[4957]: I0123 10:54:00.323487 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-d6nt4" event={"ID":"f0f6f781-8789-47c0-badf-5f4a9dc36621","Type":"ContainerStarted","Data":"3cc706b1aea58dd37c043d1b59822f8b3b9fd3ad9c2eeda1eee6ddd5feddf463"} Jan 23 10:54:00 crc kubenswrapper[4957]: I0123 10:54:00.358665 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 23 10:54:00 crc kubenswrapper[4957]: E0123 10:54:00.360191 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-23 10:54:00.860174565 +0000 UTC m=+150.397427252 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 23 10:54:00 crc kubenswrapper[4957]: I0123 10:54:00.399613 4957 patch_prober.go:28] interesting pod/downloads-7954f5f757-d94z6 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body= Jan 23 10:54:00 crc kubenswrapper[4957]: I0123 10:54:00.399671 4957 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-d94z6" podUID="92ace232-9107-480d-a4cd-27d7bd114efd" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" Jan 23 10:54:00 crc kubenswrapper[4957]: I0123 10:54:00.439314 4957 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7t6gz" podStartSLOduration=131.439298464 podStartE2EDuration="2m11.439298464s" podCreationTimestamp="2026-01-23 10:51:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-23 10:54:00.429808837 +0000 UTC m=+149.967061524" watchObservedRunningTime="2026-01-23 10:54:00.439298464 +0000 UTC m=+149.976551151" Jan 23 10:54:00 crc kubenswrapper[4957]: I0123 10:54:00.442808 4957 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-d6nt4" podStartSLOduration=131.442787234 podStartE2EDuration="2m11.442787234s" podCreationTimestamp="2026-01-23 10:51:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-23 10:54:00.375193116 +0000 UTC m=+149.912445803" watchObservedRunningTime="2026-01-23 10:54:00.442787234 +0000 UTC m=+149.980039921" Jan 23 10:54:00 crc kubenswrapper[4957]: I0123 10:54:00.458254 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29486085-pm58b"] Jan 23 10:54:00 crc kubenswrapper[4957]: I0123 10:54:00.464020 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hgc64\" (UID: \"e8dec53f-51c9-4e3d-b111-45152d5b0c71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hgc64" Jan 23 10:54:00 crc kubenswrapper[4957]: E0123 10:54:00.465548 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-23 10:54:00.965534355 +0000 UTC m=+150.502787042 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hgc64" (UID: "e8dec53f-51c9-4e3d-b111-45152d5b0c71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 23 10:54:00 crc kubenswrapper[4957]: I0123 10:54:00.500750 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-7cfk5"] Jan 23 10:54:00 crc kubenswrapper[4957]: I0123 10:54:00.522362 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-92gbq"] Jan 23 10:54:00 crc kubenswrapper[4957]: I0123 10:54:00.526327 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-rxncn"] Jan 23 10:54:00 crc kubenswrapper[4957]: I0123 10:54:00.529054 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-g5hzp"] Jan 23 10:54:00 crc kubenswrapper[4957]: I0123 10:54:00.531804 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7dpwm"] Jan 23 10:54:00 crc kubenswrapper[4957]: I0123 10:54:00.567229 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 23 10:54:00 crc kubenswrapper[4957]: E0123 10:54:00.567925 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-23 10:54:01.06791083 +0000 UTC m=+150.605163517 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 23 10:54:00 crc kubenswrapper[4957]: I0123 10:54:00.668539 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hgc64\" (UID: \"e8dec53f-51c9-4e3d-b111-45152d5b0c71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hgc64" Jan 23 10:54:00 crc kubenswrapper[4957]: E0123 10:54:00.668913 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-23 10:54:01.168896827 +0000 UTC m=+150.706149514 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hgc64" (UID: "e8dec53f-51c9-4e3d-b111-45152d5b0c71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 23 10:54:00 crc kubenswrapper[4957]: I0123 10:54:00.775421 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 23 10:54:00 crc kubenswrapper[4957]: E0123 10:54:00.775669 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-23 10:54:01.275655264 +0000 UTC m=+150.812907951 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 23 10:54:00 crc kubenswrapper[4957]: I0123 10:54:00.776643 4957 patch_prober.go:28] interesting pod/router-default-5444994796-6brn9 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 23 10:54:00 crc kubenswrapper[4957]: [-]has-synced failed: reason withheld Jan 23 10:54:00 crc kubenswrapper[4957]: [+]process-running ok Jan 23 10:54:00 crc kubenswrapper[4957]: healthz check failed Jan 23 10:54:00 crc kubenswrapper[4957]: I0123 10:54:00.776718 4957 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6brn9" podUID="beac68ed-ac4c-424e-aaff-79a4f7f246d0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 23 10:54:00 crc kubenswrapper[4957]: I0123 10:54:00.876091 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hgc64\" (UID: \"e8dec53f-51c9-4e3d-b111-45152d5b0c71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hgc64" Jan 23 10:54:00 crc kubenswrapper[4957]: E0123 10:54:00.876429 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-23 10:54:01.376418575 +0000 UTC m=+150.913671262 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hgc64" (UID: "e8dec53f-51c9-4e3d-b111-45152d5b0c71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 23 10:54:00 crc kubenswrapper[4957]: I0123 10:54:00.979543 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 23 10:54:00 crc kubenswrapper[4957]: E0123 10:54:00.979615 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-23 10:54:01.47959884 +0000 UTC m=+151.016851527 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 23 10:54:00 crc kubenswrapper[4957]: I0123 10:54:00.979989 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hgc64\" (UID: \"e8dec53f-51c9-4e3d-b111-45152d5b0c71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hgc64" Jan 23 10:54:00 crc kubenswrapper[4957]: E0123 10:54:00.980309 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-23 10:54:01.480301697 +0000 UTC m=+151.017554384 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hgc64" (UID: "e8dec53f-51c9-4e3d-b111-45152d5b0c71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 23 10:54:01 crc kubenswrapper[4957]: I0123 10:54:01.082060 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 23 10:54:01 crc kubenswrapper[4957]: E0123 10:54:01.082370 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-23 10:54:01.582356923 +0000 UTC m=+151.119609610 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 23 10:54:01 crc kubenswrapper[4957]: I0123 10:54:01.183108 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hgc64\" (UID: \"e8dec53f-51c9-4e3d-b111-45152d5b0c71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hgc64" Jan 23 10:54:01 crc kubenswrapper[4957]: E0123 10:54:01.183428 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-23 10:54:01.683415741 +0000 UTC m=+151.220668428 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hgc64" (UID: "e8dec53f-51c9-4e3d-b111-45152d5b0c71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 23 10:54:01 crc kubenswrapper[4957]: E0123 10:54:01.287043 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-23 10:54:01.787028167 +0000 UTC m=+151.324280854 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 23 10:54:01 crc kubenswrapper[4957]: I0123 10:54:01.286962 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 23 10:54:01 crc kubenswrapper[4957]: I0123 10:54:01.289619 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hgc64\" (UID: \"e8dec53f-51c9-4e3d-b111-45152d5b0c71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hgc64" Jan 23 10:54:01 crc kubenswrapper[4957]: E0123 10:54:01.290220 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-23 10:54:01.790205649 +0000 UTC m=+151.327458336 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hgc64" (UID: "e8dec53f-51c9-4e3d-b111-45152d5b0c71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 23 10:54:01 crc kubenswrapper[4957]: I0123 10:54:01.390780 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 23 10:54:01 crc kubenswrapper[4957]: E0123 10:54:01.391021 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-23 10:54:01.891000351 +0000 UTC m=+151.428253038 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 23 10:54:01 crc kubenswrapper[4957]: I0123 10:54:01.391122 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hgc64\" (UID: \"e8dec53f-51c9-4e3d-b111-45152d5b0c71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hgc64" Jan 23 10:54:01 crc kubenswrapper[4957]: E0123 10:54:01.391447 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-23 10:54:01.891437523 +0000 UTC m=+151.428690220 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hgc64" (UID: "e8dec53f-51c9-4e3d-b111-45152d5b0c71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 23 10:54:01 crc kubenswrapper[4957]: I0123 10:54:01.431427 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-rxncn" event={"ID":"a03b75a6-72e3-45a2-a6aa-95f65af54796","Type":"ContainerStarted","Data":"8244122c5d7c2d193d7c5c46f0f56f01492213c96f69c0008dc66fbde9d28f59"} Jan 23 10:54:01 crc kubenswrapper[4957]: I0123 10:54:01.449129 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tnjn2" event={"ID":"12c22d2e-17e6-4c9a-83f0-24225d15d476","Type":"ContainerStarted","Data":"ba9773377d474fc4e44cf95333a76efdb86a0d869649ac23724b9c19a7e91d0a"} Jan 23 10:54:01 crc kubenswrapper[4957]: I0123 10:54:01.449177 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tnjn2" event={"ID":"12c22d2e-17e6-4c9a-83f0-24225d15d476","Type":"ContainerStarted","Data":"f234c1426a5c212a2c09d59f85e11bbebe16db9933cf399d8560aea76544954b"} Jan 23 10:54:01 crc kubenswrapper[4957]: I0123 10:54:01.452564 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-g5hzp" event={"ID":"933ef718-6583-4b78-b8ca-79cfab4926ee","Type":"ContainerStarted","Data":"98d7229e728bd290cfeabdb23c95500f67a6525021ecf8b521e6052c6c257086"} Jan 23 10:54:01 crc kubenswrapper[4957]: I0123 10:54:01.456663 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-5vdlt" event={"ID":"a4d8bbc7-e221-4057-ae8c-c63d33b2e4f5","Type":"ContainerStarted","Data":"9c69521e0b6adc7942864df907c54228b024374086f733b028abed5693584b0a"} Jan 23 10:54:01 crc kubenswrapper[4957]: I0123 10:54:01.457464 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-5vdlt" Jan 23 10:54:01 crc kubenswrapper[4957]: I0123 10:54:01.461584 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nnxst" event={"ID":"6e07d3a9-d361-4fb4-a795-0849abadd0f0","Type":"ContainerStarted","Data":"75ef21964ef1a8cbdff8d8f0197d7dfda0a8af86de34a9f5439a30e1c9791a6b"} Jan 23 10:54:01 crc kubenswrapper[4957]: I0123 10:54:01.494874 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 23 10:54:01 crc kubenswrapper[4957]: E0123 10:54:01.499466 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-23 10:54:01.999437693 +0000 UTC m=+151.536690390 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 23 10:54:01 crc kubenswrapper[4957]: I0123 10:54:01.513535 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-wtmvm" event={"ID":"dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d","Type":"ContainerStarted","Data":"c2b3953fa767fab028d1ccb214f96f563a30df59f1b0977cb8145ea38261e6d7"} Jan 23 10:54:01 crc kubenswrapper[4957]: I0123 10:54:01.514606 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-wtmvm" Jan 23 10:54:01 crc kubenswrapper[4957]: I0123 10:54:01.533180 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29486085-pm58b" event={"ID":"757f7f5f-b2fc-4fa0-b59b-63ad3548a307","Type":"ContainerStarted","Data":"01fbe8a78f7bb32fbfe3016b603be8726970d9abe318cffafdbd39f3382e2270"} Jan 23 10:54:01 crc kubenswrapper[4957]: I0123 10:54:01.556252 4957 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-wtmvm" podStartSLOduration=132.55623657 podStartE2EDuration="2m12.55623657s" podCreationTimestamp="2026-01-23 10:51:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-23 10:54:01.55584955 +0000 UTC m=+151.093102237" watchObservedRunningTime="2026-01-23 10:54:01.55623657 +0000 UTC m=+151.093489257" Jan 23 10:54:01 crc kubenswrapper[4957]: I0123 10:54:01.556711 4957 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-5vdlt" podStartSLOduration=132.556706413 podStartE2EDuration="2m12.556706413s" podCreationTimestamp="2026-01-23 10:51:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-01-23 10:54:01.499509965 +0000 UTC m=+151.036762652" watchObservedRunningTime="2026-01-23 10:54:01.556706413 +0000 UTC m=+151.093959100" Jan 23 10:54:01 crc kubenswrapper[4957]: I0123 10:54:01.564507 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jp7cg" event={"ID":"7d5670ad-73cc-493e-a54f-e684a6f00f06","Type":"ContainerStarted","Data":"7f3d5aaf8489d5896f2962e0a1b5fe103a6384331cdf907d1e8d7daf6f7c65ca"} Jan 23 10:54:01 crc kubenswrapper[4957]: I0123 10:54:01.564556 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jp7cg" event={"ID":"7d5670ad-73cc-493e-a54f-e684a6f00f06","Type":"ContainerStarted","Data":"42bcf6e4b3000b49edd287b9ed65df6c16bd78441b90a23f659cbd47a5a7ab89"} Jan 23 10:54:01 crc kubenswrapper[4957]: I0123 10:54:01.582202 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-jh66z" event={"ID":"214c7e14-5663-41c3-8a75-4573dff48b63","Type":"ContainerStarted","Data":"7ba915b7a8c7070c9aacd634a0d310995fa320519e9f669f34e43259636e7a42"} Jan 23 10:54:01 crc kubenswrapper[4957]: I0123 10:54:01.592010 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xs8l7" event={"ID":"04b86548-903a-4a0b-bb5f-c9cd297a9047","Type":"ContainerStarted","Data":"3ae5c80d91cdfad935fc0c79112cb05d0182e644e44662cb9700a097148e4c56"} Jan 23 10:54:01 crc kubenswrapper[4957]: I0123 10:54:01.603051 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hgc64\" (UID: \"e8dec53f-51c9-4e3d-b111-45152d5b0c71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hgc64" Jan 23 10:54:01 crc kubenswrapper[4957]: E0123 10:54:01.604323 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-23 10:54:02.104307311 +0000 UTC m=+151.641559988 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hgc64" (UID: "e8dec53f-51c9-4e3d-b111-45152d5b0c71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 23 10:54:01 crc kubenswrapper[4957]: I0123 10:54:01.606043 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7dpwm" event={"ID":"96879d14-19e3-4044-a2ea-163a66fe801a","Type":"ContainerStarted","Data":"3b55f6336060dedf927bd771cabfd504ef843a2c683ae56e97a4baa8e33adf41"} Jan 23 10:54:01 crc kubenswrapper[4957]: I0123 10:54:01.690711 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-lrnc4" event={"ID":"4238c53f-acdc-409f-8d1d-e4608fe5c239","Type":"ContainerStarted","Data":"a664dd4400995bbf4a8d2b56c62b50c2fc88a10588d40624c6c3be85f2fdd7e1"} Jan 23 10:54:01 crc kubenswrapper[4957]: I0123 10:54:01.705129 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 23 10:54:01 crc kubenswrapper[4957]: E0123 10:54:01.705711 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-23 10:54:02.205693458 +0000 UTC m=+151.742946145 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 23 10:54:01 crc kubenswrapper[4957]: I0123 10:54:01.725977 4957 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-jh66z" podStartSLOduration=132.725960456 podStartE2EDuration="2m12.725960456s" podCreationTimestamp="2026-01-23 10:51:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-23 10:54:01.617423882 +0000 UTC m=+151.154676589" watchObservedRunningTime="2026-01-23 10:54:01.725960456 +0000 UTC m=+151.263213143" Jan 23 10:54:01 crc kubenswrapper[4957]: I0123 10:54:01.728173 4957 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-lrnc4" podStartSLOduration=131.728167013 podStartE2EDuration="2m11.728167013s" podCreationTimestamp="2026-01-23 10:51:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-23 10:54:01.725592035 +0000 UTC m=+151.262844722" watchObservedRunningTime="2026-01-23 10:54:01.728167013 +0000 UTC m=+151.265419700" Jan 23 10:54:01 crc kubenswrapper[4957]: I0123 10:54:01.735169 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-n7rd5" event={"ID":"04b22862-334a-4deb-b327-10e17c568e99","Type":"ContainerStarted","Data":"f529bf62c85782f775bd0b16888c08c85368a84a53b619e54ba4d8e3934d43b3"} Jan 23 10:54:01 crc kubenswrapper[4957]: I0123 10:54:01.776743 4957 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-n7rd5" podStartSLOduration=6.776723196 podStartE2EDuration="6.776723196s" podCreationTimestamp="2026-01-23 10:53:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-23 10:54:01.774150289 +0000 UTC m=+151.311402976" watchObservedRunningTime="2026-01-23 10:54:01.776723196 +0000 UTC m=+151.313975883" Jan 23 10:54:01 crc kubenswrapper[4957]: I0123 10:54:01.781621 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kvtws" event={"ID":"aad2f022-0b03-41aa-a16d-000844e34eed","Type":"ContainerStarted","Data":"fc2467660f8ec2ea5e193ffb6b2c27d11f21b3a19c4f141a51ca192120d318ac"} Jan 23 10:54:01 crc kubenswrapper[4957]: I0123 10:54:01.782195 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kvtws" Jan 23 10:54:01 crc kubenswrapper[4957]: I0123 10:54:01.777060 4957 patch_prober.go:28] interesting pod/router-default-5444994796-6brn9 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 23 10:54:01 crc kubenswrapper[4957]: [-]has-synced failed: reason withheld Jan 23 10:54:01 crc 
kubenswrapper[4957]: [+]process-running ok Jan 23 10:54:01 crc kubenswrapper[4957]: healthz check failed Jan 23 10:54:01 crc kubenswrapper[4957]: I0123 10:54:01.782586 4957 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6brn9" podUID="beac68ed-ac4c-424e-aaff-79a4f7f246d0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 23 10:54:01 crc kubenswrapper[4957]: I0123 10:54:01.795934 4957 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-kvtws container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.39:8443/healthz\": dial tcp 10.217.0.39:8443: connect: connection refused" start-of-body= Jan 23 10:54:01 crc kubenswrapper[4957]: I0123 10:54:01.796092 4957 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kvtws" podUID="aad2f022-0b03-41aa-a16d-000844e34eed" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.39:8443/healthz\": dial tcp 10.217.0.39:8443: connect: connection refused" Jan 23 10:54:01 crc kubenswrapper[4957]: I0123 10:54:01.801484 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-2z4v2" event={"ID":"da48191c-513e-4069-b4b7-8f6de3363326","Type":"ContainerStarted","Data":"c8b5c9e59ebc138f8d224e1759574e61591d51ca412c9ac9b84e6a081b9c2a65"} Jan 23 10:54:01 crc kubenswrapper[4957]: I0123 10:54:01.814829 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hgc64\" (UID: \"e8dec53f-51c9-4e3d-b111-45152d5b0c71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hgc64" Jan 23 10:54:01 crc kubenswrapper[4957]: E0123 10:54:01.817169 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-23 10:54:02.317157878 +0000 UTC m=+151.854410565 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hgc64" (UID: "e8dec53f-51c9-4e3d-b111-45152d5b0c71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 23 10:54:01 crc kubenswrapper[4957]: I0123 10:54:01.826196 4957 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kvtws" podStartSLOduration=131.826179272 podStartE2EDuration="2m11.826179272s" podCreationTimestamp="2026-01-23 10:51:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-23 10:54:01.81453715 +0000 UTC m=+151.351789837" watchObservedRunningTime="2026-01-23 10:54:01.826179272 +0000 UTC m=+151.363431959" Jan 23 10:54:01 crc kubenswrapper[4957]: I0123 10:54:01.833949 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-grskq" event={"ID":"9f5fe703-fd89-4007-ae49-96ae0202d69c","Type":"ContainerStarted","Data":"764510fbdb5e63b2acd0183244c63095bf3ca03aabf1770c75826f2368504f14"} Jan 23 10:54:01 crc kubenswrapper[4957]: I0123 10:54:01.838919 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7t6gz" event={"ID":"7183c961-6fa6-49c8-909a-7defa10a655a","Type":"ContainerStarted","Data":"94e7d02f9b913706ea81d5b9647db23725f57f6953cad230356642c799eb9e7f"} Jan 23 10:54:01 crc kubenswrapper[4957]: I0123 10:54:01.838959 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7t6gz" event={"ID":"7183c961-6fa6-49c8-909a-7defa10a655a","Type":"ContainerStarted","Data":"a6ad72fb1c91521ef3a5593e64f60c57c41fe6bc0f6d8ec6ecc4749f3fbeb5d5"} Jan 23 10:54:01 crc kubenswrapper[4957]: I0123 10:54:01.854028 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-wtmvm" Jan 23 10:54:01 crc kubenswrapper[4957]: I0123 10:54:01.857836 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-bcjxk" event={"ID":"9af57495-ed2b-4a03-8206-b26948dfa61a","Type":"ContainerStarted","Data":"66668a557df2bc3ea6cf02b200d674fa7d95fdf4f89997a6909430e85a27435e"} Jan 23 10:54:01 crc kubenswrapper[4957]: I0123 10:54:01.858119 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-bcjxk" Jan 23 10:54:01 crc kubenswrapper[4957]: I0123 10:54:01.875350 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-wt5d2" event={"ID":"6cd38561-c9ad-4248-adfe-b62f67cf4221","Type":"ContainerStarted","Data":"849ddec485d60c32ae9274c3adf6afe437c9defca812140b710d81e4e2e74c6f"} Jan 23 10:54:01 crc kubenswrapper[4957]: I0123 10:54:01.875392 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-wt5d2" 
event={"ID":"6cd38561-c9ad-4248-adfe-b62f67cf4221","Type":"ContainerStarted","Data":"7b0cdbc198a63c329276cc3f0e09f4d8604f5f9376a5e8fb2eba3b7265df66f5"} Jan 23 10:54:01 crc kubenswrapper[4957]: I0123 10:54:01.885594 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-7cfk5" event={"ID":"c951bf2d-f3fe-4a75-8e95-040c46cb1f01","Type":"ContainerStarted","Data":"5d8dba33398399fc101e8bbfab78cbb211bd0bcbd60f97616bce85d72e438a66"} Jan 23 10:54:01 crc kubenswrapper[4957]: I0123 10:54:01.892827 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-rfzwr" event={"ID":"808262b6-32ca-42f1-938e-f009dac6b1db","Type":"ContainerStarted","Data":"b4a7e973c92e00744af6bce044fbdd2e4a702f1d7b8d87491e3dad724486d454"} Jan 23 10:54:01 crc kubenswrapper[4957]: I0123 10:54:01.915128 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-scpzq" event={"ID":"b9897961-d77f-441d-9ed1-94a1ea26860e","Type":"ContainerStarted","Data":"4a345b1d11ecb3f9ddf22f1b0714f8395c908ad6c62d846fc86a2a35c6733a88"} Jan 23 10:54:01 crc kubenswrapper[4957]: I0123 10:54:01.915616 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 23 10:54:01 crc kubenswrapper[4957]: E0123 10:54:01.916926 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-23 10:54:02.416907293 +0000 UTC m=+151.954159980 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 23 10:54:01 crc kubenswrapper[4957]: I0123 10:54:01.929099 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-bcjxk" Jan 23 10:54:01 crc kubenswrapper[4957]: I0123 10:54:01.931098 4957 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-2z4v2" podStartSLOduration=131.931082301 podStartE2EDuration="2m11.931082301s" podCreationTimestamp="2026-01-23 10:51:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-23 10:54:01.874961702 +0000 UTC m=+151.412214389" watchObservedRunningTime="2026-01-23 10:54:01.931082301 +0000 UTC m=+151.468335008" Jan 23 10:54:01 crc kubenswrapper[4957]: I0123 10:54:01.961434 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-z8lhm" event={"ID":"d140b4dc-6d8e-4940-9a60-aa98665ac1b2","Type":"ContainerStarted","Data":"87909d0e3cc2dd0bf4343b228a98f6722d06e848086a5f4edd5e8cebf6e63b05"} Jan 23 10:54:01 crc kubenswrapper[4957]: I0123 10:54:01.991745 4957 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-grskq" podStartSLOduration=131.9917239 podStartE2EDuration="2m11.9917239s" podCreationTimestamp="2026-01-23 10:51:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-23 10:54:01.99021381 +0000 UTC m=+151.527466497" watchObservedRunningTime="2026-01-23 10:54:01.9917239 +0000 UTC m=+151.528976587" Jan 23 10:54:02 crc kubenswrapper[4957]: I0123 10:54:02.012068 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-92gbq" event={"ID":"f597fef0-1fc5-4826-a130-dd45792b308d","Type":"ContainerStarted","Data":"234a94c50ef89f4d597b562d54f02950b3c15461be57aef59477db1f558f90a6"} Jan 23 10:54:02 crc kubenswrapper[4957]: I0123 10:54:02.017896 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hgc64\" (UID: \"e8dec53f-51c9-4e3d-b111-45152d5b0c71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hgc64" Jan 23 10:54:02 crc kubenswrapper[4957]: E0123 10:54:02.018658 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-23 10:54:02.51863577 +0000 UTC m=+152.055888457 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hgc64" (UID: "e8dec53f-51c9-4e3d-b111-45152d5b0c71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 23 10:54:02 crc kubenswrapper[4957]: I0123 10:54:02.022795 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-8p9zs" event={"ID":"5cdf7d3f-3edb-4741-a232-1f4d969417ce","Type":"ContainerStarted","Data":"4abba19583626118dfbfe89a0041d1388351b19e81ac300bc2f3924804e6c174"} Jan 23 10:54:02 crc kubenswrapper[4957]: I0123 10:54:02.043117 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-wdgd4" event={"ID":"6e16c2a0-8698-450c-b5d9-78ab6c26a2b3","Type":"ContainerStarted","Data":"7a77ecbc5e79e464a7b59970ecd41ab077374fb7d494cd1344ee60e3049f4754"} Jan 23 10:54:02 crc kubenswrapper[4957]: I0123 10:54:02.111801 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-2zqvt" event={"ID":"5b14648d-48f6-42d6-8bd3-91ddcc54bfbc","Type":"ContainerStarted","Data":"eb2b783b17fa7fb19c50f329c4f128ec022613db04427d13b282fdc3ee6e9e4e"} Jan 23 10:54:02 crc kubenswrapper[4957]: I0123 10:54:02.119137 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 23 10:54:02 crc kubenswrapper[4957]: E0123 10:54:02.119450 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-23 10:54:02.619419661 +0000 UTC m=+152.156672348 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 23 10:54:02 crc kubenswrapper[4957]: I0123 10:54:02.119687 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hgc64\" (UID: \"e8dec53f-51c9-4e3d-b111-45152d5b0c71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hgc64" Jan 23 10:54:02 crc kubenswrapper[4957]: E0123 10:54:02.121298 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-23 10:54:02.621267509 +0000 UTC m=+152.158520196 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hgc64" (UID: "e8dec53f-51c9-4e3d-b111-45152d5b0c71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 23 10:54:02 crc kubenswrapper[4957]: I0123 10:54:02.136565 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h2fvd" event={"ID":"fe95d360-2a00-47e5-b577-817575b85417","Type":"ContainerStarted","Data":"dc5083c8910a0f758c2c94d85730712001baf7b694418b0675984a5f6fbcafc2"} Jan 23 10:54:02 crc kubenswrapper[4957]: I0123 10:54:02.136620 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h2fvd" event={"ID":"fe95d360-2a00-47e5-b577-817575b85417","Type":"ContainerStarted","Data":"6e07b8d9c8d5650a3f545790658dbd1fa6c85500d5b819e3bf304d76baa75944"} Jan 23 10:54:02 crc kubenswrapper[4957]: I0123 10:54:02.140029 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fkz4f" event={"ID":"5b657ce2-3516-4ac1-9bdb-a6fc97c19b31","Type":"ContainerStarted","Data":"a43c5a2a8daa9763f4f1f502c2fde06a6d83b56ac427d778dbd641f3f71b68e2"} Jan 23 10:54:02 crc kubenswrapper[4957]: I0123 10:54:02.176854 4957 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-bcjxk" podStartSLOduration=133.176834865 podStartE2EDuration="2m13.176834865s" podCreationTimestamp="2026-01-23 10:51:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-23 10:54:02.112605544 +0000 UTC m=+151.649858231" watchObservedRunningTime="2026-01-23 10:54:02.176834865 +0000 UTC m=+151.714087552" Jan 23 10:54:02 crc kubenswrapper[4957]: I0123 10:54:02.201611 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z6lpq" event={"ID":"410302c2-c1e9-4b31-8edb-a62078470e7f","Type":"ContainerStarted","Data":"0e21c3388bb8579ce04ed1de26a1b2ac191eadf8103c27cd5a0f90a02a08177e"} Jan 23 10:54:02 crc kubenswrapper[4957]: I0123 10:54:02.223415 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 23 10:54:02 crc kubenswrapper[4957]: E0123 10:54:02.223797 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-23 10:54:02.723783366 +0000 UTC m=+152.261036043 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 23 10:54:02 crc kubenswrapper[4957]: I0123 10:54:02.225727 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-x94pd" event={"ID":"c810606c-cfa3-4391-bbc2-7e6ff647393c","Type":"ContainerStarted","Data":"fd4e6e0b750ee2ecdba79ca0725f75ab2ab8172db8bfc9930095854f92079386"} Jan 23 10:54:02 crc kubenswrapper[4957]: I0123 10:54:02.225768 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-x94pd" event={"ID":"c810606c-cfa3-4391-bbc2-7e6ff647393c","Type":"ContainerStarted","Data":"2e82b21c96d6c9564c601e9f6ed89a51c76bdbf02b7f7ab696678b16085372b3"} Jan 23 10:54:02 crc kubenswrapper[4957]: I0123 10:54:02.225825 4957 patch_prober.go:28] interesting pod/downloads-7954f5f757-d94z6 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body= Jan 23 10:54:02 crc kubenswrapper[4957]: I0123 10:54:02.225858 4957 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-d94z6" podUID="92ace232-9107-480d-a4cd-27d7bd114efd" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" Jan 23 10:54:02 crc kubenswrapper[4957]: I0123 10:54:02.233461 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z6lpq" Jan 23 10:54:02 crc kubenswrapper[4957]: I0123 10:54:02.247193 4957 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-8p9zs" podStartSLOduration=132.247177435 podStartE2EDuration="2m12.247177435s" podCreationTimestamp="2026-01-23 10:51:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-23 10:54:02.177522383 +0000 UTC m=+151.714775060" watchObservedRunningTime="2026-01-23 10:54:02.247177435 +0000 UTC m=+151.784430122" Jan 23 10:54:02 crc kubenswrapper[4957]: I0123 10:54:02.248738 4957 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-wt5d2" podStartSLOduration=132.248733875 podStartE2EDuration="2m12.248733875s" podCreationTimestamp="2026-01-23 10:51:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-23 10:54:02.247633217 +0000 UTC m=+151.784885904" watchObservedRunningTime="2026-01-23 10:54:02.248733875 +0000 UTC m=+151.785986562" Jan 23 10:54:02 crc kubenswrapper[4957]: I0123 10:54:02.333549 4957 csr.go:261] certificate signing request csr-mllm2 is approved, waiting to be issued Jan 23 10:54:02 crc kubenswrapper[4957]: I0123 10:54:02.334796 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hgc64\" (UID: \"e8dec53f-51c9-4e3d-b111-45152d5b0c71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hgc64" Jan 23 10:54:02 crc kubenswrapper[4957]: E0123 10:54:02.341609 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-23 10:54:02.841595731 +0000 UTC m=+152.378848418 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hgc64" (UID: "e8dec53f-51c9-4e3d-b111-45152d5b0c71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 23 10:54:02 crc kubenswrapper[4957]: I0123 10:54:02.360935 4957 csr.go:257] certificate signing request csr-mllm2 is issued Jan 23 10:54:02 crc kubenswrapper[4957]: I0123 10:54:02.402148 4957 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h2fvd" podStartSLOduration=132.402130826 podStartE2EDuration="2m12.402130826s" podCreationTimestamp="2026-01-23 10:51:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-23 10:54:02.400978786 +0000 UTC m=+151.938231473" watchObservedRunningTime="2026-01-23 10:54:02.402130826 +0000 UTC m=+151.939383513" Jan 23 10:54:02 crc kubenswrapper[4957]: I0123 10:54:02.420208 4957 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fkz4f" podStartSLOduration=132.420183996 podStartE2EDuration="2m12.420183996s" podCreationTimestamp="2026-01-23 10:51:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-23 10:54:02.377620388 +0000 UTC m=+151.914873085" watchObservedRunningTime="2026-01-23 10:54:02.420183996 +0000 UTC m=+151.957436683" Jan 23 10:54:02 crc kubenswrapper[4957]: I0123 10:54:02.435903 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 23 10:54:02 crc kubenswrapper[4957]: E0123 10:54:02.436217 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-23 10:54:02.936204722 +0000 UTC m=+152.473457409 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 23 10:54:02 crc kubenswrapper[4957]: I0123 10:54:02.441614 4957 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-x94pd" podStartSLOduration=132.441601393 podStartE2EDuration="2m12.441601393s" podCreationTimestamp="2026-01-23 10:51:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-23 10:54:02.439699393 +0000 UTC m=+151.976952080" watchObservedRunningTime="2026-01-23 10:54:02.441601393 +0000 UTC m=+151.978854080" Jan 23 10:54:02 crc kubenswrapper[4957]: I0123 10:54:02.540157 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hgc64\" (UID: \"e8dec53f-51c9-4e3d-b111-45152d5b0c71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hgc64" Jan 23 10:54:02 crc kubenswrapper[4957]: E0123 10:54:02.540477 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-23 10:54:03.040466024 +0000 UTC m=+152.577718711 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hgc64" (UID: "e8dec53f-51c9-4e3d-b111-45152d5b0c71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 23 10:54:02 crc kubenswrapper[4957]: I0123 10:54:02.641233 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 23 10:54:02 crc kubenswrapper[4957]: E0123 10:54:02.641521 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-23 10:54:03.141492333 +0000 UTC m=+152.678745030 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 23 10:54:02 crc kubenswrapper[4957]: I0123 10:54:02.641751 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hgc64\" (UID: \"e8dec53f-51c9-4e3d-b111-45152d5b0c71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hgc64" Jan 23 10:54:02 crc kubenswrapper[4957]: E0123 10:54:02.642114 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-23 10:54:03.142100279 +0000 UTC m=+152.679352966 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hgc64" (UID: "e8dec53f-51c9-4e3d-b111-45152d5b0c71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 23 10:54:02 crc kubenswrapper[4957]: I0123 10:54:02.743085 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 23 10:54:02 crc kubenswrapper[4957]: E0123 10:54:02.743437 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-23 10:54:03.243422604 +0000 UTC m=+152.780675291 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 23 10:54:02 crc kubenswrapper[4957]: I0123 10:54:02.773256 4957 patch_prober.go:28] interesting pod/router-default-5444994796-6brn9 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 23 10:54:02 crc kubenswrapper[4957]: [-]has-synced failed: reason withheld Jan 23 10:54:02 crc kubenswrapper[4957]: [+]process-running ok Jan 23 10:54:02 crc kubenswrapper[4957]: healthz check failed Jan 23 10:54:02 crc kubenswrapper[4957]: I0123 10:54:02.773322 4957 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6brn9" podUID="beac68ed-ac4c-424e-aaff-79a4f7f246d0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 23 10:54:02 crc kubenswrapper[4957]: I0123 10:54:02.844786 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hgc64\" (UID: \"e8dec53f-51c9-4e3d-b111-45152d5b0c71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hgc64" Jan 23 10:54:02 crc kubenswrapper[4957]: E0123 10:54:02.845067 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-23 10:54:03.345054628 +0000 UTC m=+152.882307315 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hgc64" (UID: "e8dec53f-51c9-4e3d-b111-45152d5b0c71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 23 10:54:02 crc kubenswrapper[4957]: I0123 10:54:02.903911 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-5vdlt" Jan 23 10:54:02 crc kubenswrapper[4957]: I0123 10:54:02.945766 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 23 10:54:02 crc kubenswrapper[4957]: E0123 10:54:02.945974 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-23 10:54:03.445948343 +0000 UTC m=+152.983201030 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 23 10:54:02 crc kubenswrapper[4957]: I0123 10:54:02.946245 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hgc64\" (UID: \"e8dec53f-51c9-4e3d-b111-45152d5b0c71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hgc64" Jan 23 10:54:02 crc kubenswrapper[4957]: E0123 10:54:02.946575 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-23 10:54:03.446561589 +0000 UTC m=+152.983814276 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hgc64" (UID: "e8dec53f-51c9-4e3d-b111-45152d5b0c71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 23 10:54:03 crc kubenswrapper[4957]: I0123 10:54:03.047417 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 23 10:54:03 crc kubenswrapper[4957]: E0123 10:54:03.047618 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-23 10:54:03.547595088 +0000 UTC m=+153.084847775 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 23 10:54:03 crc kubenswrapper[4957]: I0123 10:54:03.047661 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hgc64\" (UID: \"e8dec53f-51c9-4e3d-b111-45152d5b0c71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hgc64" Jan 23 10:54:03 crc kubenswrapper[4957]: E0123 10:54:03.047989 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-23 10:54:03.547981658 +0000 UTC m=+153.085234345 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hgc64" (UID: "e8dec53f-51c9-4e3d-b111-45152d5b0c71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 23 10:54:03 crc kubenswrapper[4957]: I0123 10:54:03.148991 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 23 10:54:03 crc kubenswrapper[4957]: E0123 10:54:03.149187 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-23 10:54:03.649155719 +0000 UTC m=+153.186408406 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 23 10:54:03 crc kubenswrapper[4957]: I0123 10:54:03.149565 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hgc64\" (UID: \"e8dec53f-51c9-4e3d-b111-45152d5b0c71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hgc64" Jan 23 10:54:03 crc kubenswrapper[4957]: E0123 10:54:03.149865 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-23 10:54:03.649841798 +0000 UTC m=+153.187094485 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hgc64" (UID: "e8dec53f-51c9-4e3d-b111-45152d5b0c71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 23 10:54:03 crc kubenswrapper[4957]: I0123 10:54:03.235877 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-rxncn" event={"ID":"a03b75a6-72e3-45a2-a6aa-95f65af54796","Type":"ContainerStarted","Data":"cf39e06f1763a30657f2f6d765e4539f0f39057048e8f3ec962cfb345938e493"} Jan 23 10:54:03 crc kubenswrapper[4957]: I0123 10:54:03.236124 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-rxncn" event={"ID":"a03b75a6-72e3-45a2-a6aa-95f65af54796","Type":"ContainerStarted","Data":"d3702040952a2e03b4859153dd9a1f9f1e60360d9c18ccfa1e25550a00a11883"} Jan 23 10:54:03 crc kubenswrapper[4957]: I0123 10:54:03.237671 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-rfzwr" event={"ID":"808262b6-32ca-42f1-938e-f009dac6b1db","Type":"ContainerStarted","Data":"57f146d983bc6872d640a724c50c35e3269e149510000a023985b72f3dec06f4"} Jan 23 10:54:03 crc kubenswrapper[4957]: I0123 10:54:03.239507 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29486085-pm58b" event={"ID":"757f7f5f-b2fc-4fa0-b59b-63ad3548a307","Type":"ContainerStarted","Data":"ff8eb8517c84a585621502b458219b0c717c05e997760eb657da815ea20aa026"} Jan 23 10:54:03 crc kubenswrapper[4957]: I0123 10:54:03.241490 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fh2f4" event={"ID":"f918b8f5-e2ac-439f-b9b6-8a2e85dd4ba3","Type":"ContainerStarted","Data":"e292208916255bdafcc38dc85d680835c842a5fb0d18424c7844b4a725ca24f7"} Jan 23 10:54:03 crc kubenswrapper[4957]: I0123 10:54:03.244028 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-2zqvt" 
event={"ID":"5b14648d-48f6-42d6-8bd3-91ddcc54bfbc","Type":"ContainerStarted","Data":"9a0f3c0aa427f28816078e81acdb90a9e654a3ace6e86a334e22d312d397ff5a"} Jan 23 10:54:03 crc kubenswrapper[4957]: I0123 10:54:03.246448 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-92gbq" event={"ID":"f597fef0-1fc5-4826-a130-dd45792b308d","Type":"ContainerStarted","Data":"bae075947d8c5c08a7a02baadcba1780440c58765ecd46a94b0a6559d7ae1009"} Jan 23 10:54:03 crc kubenswrapper[4957]: I0123 10:54:03.246487 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-92gbq" event={"ID":"f597fef0-1fc5-4826-a130-dd45792b308d","Type":"ContainerStarted","Data":"6c058db4cd8ee4fa1690208931a6a332b176c49970f1e664b4d388b6aff090be"} Jan 23 10:54:03 crc kubenswrapper[4957]: I0123 10:54:03.250094 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 23 10:54:03 crc kubenswrapper[4957]: E0123 10:54:03.250319 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-23 10:54:03.750263649 +0000 UTC m=+153.287516336 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 23 10:54:03 crc kubenswrapper[4957]: I0123 10:54:03.250802 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hgc64\" (UID: \"e8dec53f-51c9-4e3d-b111-45152d5b0c71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hgc64" Jan 23 10:54:03 crc kubenswrapper[4957]: E0123 10:54:03.251082 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-23 10:54:03.751075301 +0000 UTC m=+153.288327978 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hgc64" (UID: "e8dec53f-51c9-4e3d-b111-45152d5b0c71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 23 10:54:03 crc kubenswrapper[4957]: I0123 10:54:03.252463 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jp7cg" event={"ID":"7d5670ad-73cc-493e-a54f-e684a6f00f06","Type":"ContainerStarted","Data":"9d902915bb93695d05861b8ce924045dc696c7601b5ca69f6c0702b11ce81ea5"} Jan 23 10:54:03 crc kubenswrapper[4957]: I0123 10:54:03.252595 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jp7cg" Jan 23 10:54:03 crc kubenswrapper[4957]: I0123 10:54:03.263692 4957 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-rfzwr" podStartSLOduration=133.263674349 podStartE2EDuration="2m13.263674349s" podCreationTimestamp="2026-01-23 10:51:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-23 10:54:03.261161793 +0000 UTC m=+152.798414490" watchObservedRunningTime="2026-01-23 10:54:03.263674349 +0000 UTC m=+152.800927036" Jan 23 10:54:03 crc kubenswrapper[4957]: I0123 10:54:03.263727 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-5t9v6" event={"ID":"6bb12583-2035-41bb-8847-fd46198f7ede","Type":"ContainerStarted","Data":"36e87e6e44f2006b45a97677d94303957032b6c42f670d3e28a7ae4bc2a1d255"} Jan 23 10:54:03 crc kubenswrapper[4957]: I0123 10:54:03.263764 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-5t9v6" event={"ID":"6bb12583-2035-41bb-8847-fd46198f7ede","Type":"ContainerStarted","Data":"0cf2df477e55e3556da2cb6b29c176e9b3960f1a09e6743126ba6b8f27b32afb"} Jan 23 10:54:03 crc kubenswrapper[4957]: I0123 10:54:03.265199 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7dpwm" event={"ID":"96879d14-19e3-4044-a2ea-163a66fe801a","Type":"ContainerStarted","Data":"d7160caf20dbced7660e0c0331c4c13ddc9664e37f81b647bc9452bdd314bdbc"} Jan 23 10:54:03 crc kubenswrapper[4957]: I0123 10:54:03.266028 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7dpwm" Jan 23 10:54:03 crc kubenswrapper[4957]: I0123 10:54:03.268286 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-7cfk5" event={"ID":"c951bf2d-f3fe-4a75-8e95-040c46cb1f01","Type":"ContainerStarted","Data":"5f972fac2d5aa710602d4698dc2dacf3fedd45dd65a2153c1bdcd250ba5c72ed"} Jan 23 10:54:03 crc kubenswrapper[4957]: I0123 10:54:03.268567 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-7cfk5" Jan 23 10:54:03 crc kubenswrapper[4957]: I0123 10:54:03.271829 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7dpwm" Jan 23 
10:54:03 crc kubenswrapper[4957]: I0123 10:54:03.272349 4957 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-7cfk5 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.22:8080/healthz\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body= Jan 23 10:54:03 crc kubenswrapper[4957]: I0123 10:54:03.272421 4957 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-7cfk5" podUID="c951bf2d-f3fe-4a75-8e95-040c46cb1f01" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.22:8080/healthz\": dial tcp 10.217.0.22:8080: connect: connection refused" Jan 23 10:54:03 crc kubenswrapper[4957]: I0123 10:54:03.272820 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-wdgd4" event={"ID":"6e16c2a0-8698-450c-b5d9-78ab6c26a2b3","Type":"ContainerStarted","Data":"00724cd8865af3693e96c6a74a3a06eb2e5c451ca7f337ee4d694eb439d45113"} Jan 23 10:54:03 crc kubenswrapper[4957]: I0123 10:54:03.278541 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-g5hzp" event={"ID":"933ef718-6583-4b78-b8ca-79cfab4926ee","Type":"ContainerStarted","Data":"d83cb2c1cb5748200a02dd35504232fd76cf42f7f7bd6e6c7c10944b0864de40"} Jan 23 10:54:03 crc kubenswrapper[4957]: I0123 10:54:03.280633 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xs8l7" event={"ID":"04b86548-903a-4a0b-bb5f-c9cd297a9047","Type":"ContainerStarted","Data":"ca445539d237ed2aecf421f9465b012486955110d227c97758dbffdb0e66d68b"} Jan 23 10:54:03 crc kubenswrapper[4957]: I0123 10:54:03.282354 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kvtws" event={"ID":"aad2f022-0b03-41aa-a16d-000844e34eed","Type":"ContainerStarted","Data":"f363b70172176a51936c42f4a5fee89248cbee60645ecf0d8dde63ca6ca4e9a5"} Jan 23 10:54:03 crc kubenswrapper[4957]: I0123 10:54:03.284466 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-8p9zs" event={"ID":"5cdf7d3f-3edb-4741-a232-1f4d969417ce","Type":"ContainerStarted","Data":"3098a66193f17cfbb269f405f9d648a04eefebdc66a812b19e2f5c7159d16dcf"} Jan 23 10:54:03 crc kubenswrapper[4957]: I0123 10:54:03.286810 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tnjn2" event={"ID":"12c22d2e-17e6-4c9a-83f0-24225d15d476","Type":"ContainerStarted","Data":"e6f7749d9aad7f8e067713bbeefce709676b1692ae4eda82539f9428df39e028"} Jan 23 10:54:03 crc kubenswrapper[4957]: I0123 10:54:03.288518 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-scpzq" event={"ID":"b9897961-d77f-441d-9ed1-94a1ea26860e","Type":"ContainerStarted","Data":"0486fd6ed901be01b1bf1aa7f1d2af4b060a735ac639b7c6fea4980289a0d9d7"} Jan 23 10:54:03 crc kubenswrapper[4957]: I0123 10:54:03.294938 4957 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fh2f4" podStartSLOduration=133.294923011 podStartE2EDuration="2m13.294923011s" podCreationTimestamp="2026-01-23 10:51:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-01-23 10:54:03.293013822 +0000 UTC m=+152.830266509" watchObservedRunningTime="2026-01-23 10:54:03.294923011 +0000 UTC m=+152.832175698" Jan 23 10:54:03 crc kubenswrapper[4957]: I0123 10:54:03.306520 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kvtws" Jan 23 10:54:03 crc kubenswrapper[4957]: I0123 10:54:03.309466 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nnxst" event={"ID":"6e07d3a9-d361-4fb4-a795-0849abadd0f0","Type":"ContainerStarted","Data":"9c027b19c75000873b1b71a7b68349647b5bd2c2a13c2a0988259cc3eccdeccc"} Jan 23 10:54:03 crc kubenswrapper[4957]: I0123 10:54:03.309527 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nnxst" event={"ID":"6e07d3a9-d361-4fb4-a795-0849abadd0f0","Type":"ContainerStarted","Data":"0989e89470f7902a0cb62e346e8e6a8484531589f682f6fc24f9b9a7adba8cb9"} Jan 23 10:54:03 crc kubenswrapper[4957]: I0123 10:54:03.348317 4957 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jp7cg" podStartSLOduration=133.348293089 podStartE2EDuration="2m13.348293089s" podCreationTimestamp="2026-01-23 10:51:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-23 10:54:03.340115487 +0000 UTC m=+152.877368174" watchObservedRunningTime="2026-01-23 10:54:03.348293089 +0000 UTC m=+152.885545776" Jan 23 10:54:03 crc kubenswrapper[4957]: I0123 10:54:03.353928 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 23 10:54:03 crc kubenswrapper[4957]: E0123 10:54:03.354347 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-23 10:54:03.854328117 +0000 UTC m=+153.391580804 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 23 10:54:03 crc kubenswrapper[4957]: I0123 10:54:03.362417 4957 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-01-23 10:49:02 +0000 UTC, rotation deadline is 2026-10-27 04:29:38.00544714 +0000 UTC Jan 23 10:54:03 crc kubenswrapper[4957]: I0123 10:54:03.362453 4957 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6641h35m34.642996411s for next certificate rotation Jan 23 10:54:03 crc kubenswrapper[4957]: I0123 10:54:03.396682 4957 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29486085-pm58b" podStartSLOduration=134.396664658 podStartE2EDuration="2m14.396664658s" podCreationTimestamp="2026-01-23 10:51:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-23 10:54:03.373758573 +0000 UTC m=+152.911011260" watchObservedRunningTime="2026-01-23 10:54:03.396664658 +0000 UTC m=+152.933917345" Jan 23 10:54:03 crc kubenswrapper[4957]: I0123 10:54:03.420772 4957 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-92gbq" podStartSLOduration=133.420756545 podStartE2EDuration="2m13.420756545s" podCreationTimestamp="2026-01-23 10:51:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-23 10:54:03.397452838 +0000 UTC m=+152.934705535" watchObservedRunningTime="2026-01-23 10:54:03.420756545 +0000 UTC m=+152.958009232" Jan 23 10:54:03 crc kubenswrapper[4957]: I0123 10:54:03.452609 4957 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-2zqvt" podStartSLOduration=133.452594683 podStartE2EDuration="2m13.452594683s" podCreationTimestamp="2026-01-23 10:51:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-23 10:54:03.421000141 +0000 UTC m=+152.958252828" watchObservedRunningTime="2026-01-23 10:54:03.452594683 +0000 UTC m=+152.989847370" Jan 23 10:54:03 crc kubenswrapper[4957]: I0123 10:54:03.456310 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hgc64\" (UID: \"e8dec53f-51c9-4e3d-b111-45152d5b0c71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hgc64" Jan 23 10:54:03 crc kubenswrapper[4957]: E0123 10:54:03.466428 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-23 10:54:03.966388302 +0000 UTC m=+153.503640989 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hgc64" (UID: "e8dec53f-51c9-4e3d-b111-45152d5b0c71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 23 10:54:03 crc kubenswrapper[4957]: I0123 10:54:03.494978 4957 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xs8l7" podStartSLOduration=133.494959705 podStartE2EDuration="2m13.494959705s" podCreationTimestamp="2026-01-23 10:51:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-23 10:54:03.492679467 +0000 UTC m=+153.029932154" watchObservedRunningTime="2026-01-23 10:54:03.494959705 +0000 UTC m=+153.032212402" Jan 23 10:54:03 crc kubenswrapper[4957]: I0123 10:54:03.496446 4957 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7dpwm" podStartSLOduration=133.496440204 podStartE2EDuration="2m13.496440204s" podCreationTimestamp="2026-01-23 10:51:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-23 10:54:03.455347325 +0000 UTC m=+152.992600012" watchObservedRunningTime="2026-01-23 10:54:03.496440204 +0000 UTC m=+153.033692901" Jan 23 10:54:03 crc kubenswrapper[4957]: I0123 10:54:03.558725 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 23 10:54:03 crc kubenswrapper[4957]: E0123 10:54:03.559302 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-23 10:54:04.059273068 +0000 UTC m=+153.596525755 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 23 10:54:03 crc kubenswrapper[4957]: I0123 10:54:03.572995 4957 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-wdgd4" podStartSLOduration=133.572979345 podStartE2EDuration="2m13.572979345s" podCreationTimestamp="2026-01-23 10:51:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-23 10:54:03.536490296 +0000 UTC m=+153.073742983" watchObservedRunningTime="2026-01-23 10:54:03.572979345 +0000 UTC m=+153.110232032" Jan 23 10:54:03 crc kubenswrapper[4957]: I0123 10:54:03.603050 4957 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tnjn2" podStartSLOduration=133.603034987 podStartE2EDuration="2m13.603034987s" podCreationTimestamp="2026-01-23 10:51:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-23 10:54:03.573806606 +0000 UTC m=+153.111059293" watchObservedRunningTime="2026-01-23 10:54:03.603034987 +0000 UTC m=+153.140287674" Jan 23 10:54:03 crc kubenswrapper[4957]: I0123 10:54:03.613605 4957 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-scpzq" podStartSLOduration=8.613584031 podStartE2EDuration="8.613584031s" podCreationTimestamp="2026-01-23 10:53:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-23 10:54:03.605553772 +0000 UTC m=+153.142806459" watchObservedRunningTime="2026-01-23 10:54:03.613584031 +0000 UTC m=+153.150836718" Jan 23 10:54:03 crc kubenswrapper[4957]: I0123 10:54:03.636706 4957 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-g5hzp" podStartSLOduration=133.636689782 podStartE2EDuration="2m13.636689782s" podCreationTimestamp="2026-01-23 10:51:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-23 10:54:03.635438999 +0000 UTC m=+153.172691686" watchObservedRunningTime="2026-01-23 10:54:03.636689782 +0000 UTC m=+153.173942469" Jan 23 10:54:03 crc kubenswrapper[4957]: I0123 10:54:03.659983 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hgc64\" (UID: \"e8dec53f-51c9-4e3d-b111-45152d5b0c71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hgc64" Jan 23 10:54:03 crc kubenswrapper[4957]: E0123 10:54:03.660338 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-01-23 10:54:04.160326377 +0000 UTC m=+153.697579064 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hgc64" (UID: "e8dec53f-51c9-4e3d-b111-45152d5b0c71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 23 10:54:03 crc kubenswrapper[4957]: I0123 10:54:03.701622 4957 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nnxst" podStartSLOduration=133.701603771 podStartE2EDuration="2m13.701603771s" podCreationTimestamp="2026-01-23 10:51:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-23 10:54:03.679341222 +0000 UTC m=+153.216593979" watchObservedRunningTime="2026-01-23 10:54:03.701603771 +0000 UTC m=+153.238856458" Jan 23 10:54:03 crc kubenswrapper[4957]: I0123 10:54:03.701878 4957 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-5t9v6" podStartSLOduration=134.701874378 podStartE2EDuration="2m14.701874378s" podCreationTimestamp="2026-01-23 10:51:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-23 10:54:03.70081297 +0000 UTC m=+153.238065657" watchObservedRunningTime="2026-01-23 10:54:03.701874378 +0000 UTC m=+153.239127065" Jan 23 10:54:03 crc kubenswrapper[4957]: I0123 10:54:03.717716 4957 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-7cfk5" podStartSLOduration=133.71769973 podStartE2EDuration="2m13.71769973s" podCreationTimestamp="2026-01-23 10:51:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-23 10:54:03.717630908 +0000 UTC m=+153.254883585" watchObservedRunningTime="2026-01-23 10:54:03.71769973 +0000 UTC m=+153.254952417" Jan 23 10:54:03 crc kubenswrapper[4957]: I0123 10:54:03.756884 4957 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-5zrcr"] Jan 23 10:54:03 crc kubenswrapper[4957]: I0123 10:54:03.758239 4957 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5zrcr" Jan 23 10:54:03 crc kubenswrapper[4957]: I0123 10:54:03.760649 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Jan 23 10:54:03 crc kubenswrapper[4957]: I0123 10:54:03.762622 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 23 10:54:03 crc kubenswrapper[4957]: E0123 10:54:03.762913 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-23 10:54:04.262899416 +0000 UTC m=+153.800152103 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 23 10:54:03 crc kubenswrapper[4957]: I0123 10:54:03.774056 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5zrcr"] Jan 23 10:54:03 crc kubenswrapper[4957]: I0123 10:54:03.774630 4957 patch_prober.go:28] interesting pod/router-default-5444994796-6brn9 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 23 10:54:03 crc kubenswrapper[4957]: [-]has-synced failed: reason withheld Jan 23 10:54:03 crc kubenswrapper[4957]: [+]process-running ok Jan 23 10:54:03 crc kubenswrapper[4957]: healthz check failed Jan 23 10:54:03 crc kubenswrapper[4957]: I0123 10:54:03.774701 4957 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6brn9" podUID="beac68ed-ac4c-424e-aaff-79a4f7f246d0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 23 10:54:03 crc kubenswrapper[4957]: I0123 10:54:03.864469 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd5862ba-aabd-4c6f-b546-3f22d40592b8-catalog-content\") pod \"certified-operators-5zrcr\" (UID: \"fd5862ba-aabd-4c6f-b546-3f22d40592b8\") " pod="openshift-marketplace/certified-operators-5zrcr" Jan 23 10:54:03 crc kubenswrapper[4957]: I0123 10:54:03.864766 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd5862ba-aabd-4c6f-b546-3f22d40592b8-utilities\") pod \"certified-operators-5zrcr\" (UID: \"fd5862ba-aabd-4c6f-b546-3f22d40592b8\") " pod="openshift-marketplace/certified-operators-5zrcr" Jan 23 10:54:03 crc kubenswrapper[4957]: I0123 10:54:03.864899 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hgc64\" (UID: \"e8dec53f-51c9-4e3d-b111-45152d5b0c71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hgc64" Jan 23 10:54:03 crc kubenswrapper[4957]: I0123 10:54:03.865022 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5k4m\" (UniqueName: \"kubernetes.io/projected/fd5862ba-aabd-4c6f-b546-3f22d40592b8-kube-api-access-g5k4m\") pod \"certified-operators-5zrcr\" (UID: \"fd5862ba-aabd-4c6f-b546-3f22d40592b8\") " pod="openshift-marketplace/certified-operators-5zrcr" Jan 23 10:54:03 crc kubenswrapper[4957]: E0123 10:54:03.865289 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-23 10:54:04.365260199 +0000 UTC m=+153.902512886 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hgc64" (UID: "e8dec53f-51c9-4e3d-b111-45152d5b0c71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 23 10:54:03 crc kubenswrapper[4957]: I0123 10:54:03.957620 4957 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-l7xvl"] Jan 23 10:54:03 crc kubenswrapper[4957]: I0123 10:54:03.958479 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l7xvl" Jan 23 10:54:03 crc kubenswrapper[4957]: I0123 10:54:03.960724 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Jan 23 10:54:03 crc kubenswrapper[4957]: I0123 10:54:03.965600 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 23 10:54:03 crc kubenswrapper[4957]: I0123 10:54:03.965812 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5k4m\" (UniqueName: \"kubernetes.io/projected/fd5862ba-aabd-4c6f-b546-3f22d40592b8-kube-api-access-g5k4m\") pod \"certified-operators-5zrcr\" (UID: \"fd5862ba-aabd-4c6f-b546-3f22d40592b8\") " pod="openshift-marketplace/certified-operators-5zrcr" Jan 23 10:54:03 crc kubenswrapper[4957]: I0123 10:54:03.965850 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd5862ba-aabd-4c6f-b546-3f22d40592b8-catalog-content\") pod \"certified-operators-5zrcr\" (UID: \"fd5862ba-aabd-4c6f-b546-3f22d40592b8\") " pod="openshift-marketplace/certified-operators-5zrcr" Jan 23 10:54:03 crc kubenswrapper[4957]: E0123 10:54:03.965907 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-01-23 10:54:04.465888096 +0000 UTC m=+154.003140783 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 23 10:54:03 crc kubenswrapper[4957]: I0123 10:54:03.966057 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd5862ba-aabd-4c6f-b546-3f22d40592b8-utilities\") pod \"certified-operators-5zrcr\" (UID: \"fd5862ba-aabd-4c6f-b546-3f22d40592b8\") " pod="openshift-marketplace/certified-operators-5zrcr" Jan 23 10:54:03 crc kubenswrapper[4957]: I0123 10:54:03.966226 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd5862ba-aabd-4c6f-b546-3f22d40592b8-catalog-content\") pod \"certified-operators-5zrcr\" (UID: \"fd5862ba-aabd-4c6f-b546-3f22d40592b8\") " pod="openshift-marketplace/certified-operators-5zrcr" Jan 23 10:54:03 crc kubenswrapper[4957]: I0123 10:54:03.966442 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd5862ba-aabd-4c6f-b546-3f22d40592b8-utilities\") pod \"certified-operators-5zrcr\" (UID: \"fd5862ba-aabd-4c6f-b546-3f22d40592b8\") " pod="openshift-marketplace/certified-operators-5zrcr" Jan 23 10:54:03 crc kubenswrapper[4957]: I0123 10:54:03.980674 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-l7xvl"] Jan 23 10:54:03 crc kubenswrapper[4957]: I0123 10:54:03.994711 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5k4m\" (UniqueName: \"kubernetes.io/projected/fd5862ba-aabd-4c6f-b546-3f22d40592b8-kube-api-access-g5k4m\") pod \"certified-operators-5zrcr\" (UID: \"fd5862ba-aabd-4c6f-b546-3f22d40592b8\") " pod="openshift-marketplace/certified-operators-5zrcr" Jan 23 10:54:04 crc kubenswrapper[4957]: I0123 10:54:04.044816 4957 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Jan 23 10:54:04 crc kubenswrapper[4957]: I0123 10:54:04.067964 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4210122d-91b9-4890-a8d0-23e71c42d121-catalog-content\") pod \"community-operators-l7xvl\" (UID: \"4210122d-91b9-4890-a8d0-23e71c42d121\") " pod="openshift-marketplace/community-operators-l7xvl" Jan 23 10:54:04 crc kubenswrapper[4957]: I0123 10:54:04.068246 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4210122d-91b9-4890-a8d0-23e71c42d121-utilities\") pod \"community-operators-l7xvl\" (UID: \"4210122d-91b9-4890-a8d0-23e71c42d121\") " pod="openshift-marketplace/community-operators-l7xvl" Jan 23 10:54:04 crc kubenswrapper[4957]: I0123 10:54:04.068408 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hgc64\" (UID: \"e8dec53f-51c9-4e3d-b111-45152d5b0c71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hgc64" Jan 23 10:54:04 crc kubenswrapper[4957]: E0123 10:54:04.068750 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-23 10:54:04.568738542 +0000 UTC m=+154.105991219 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hgc64" (UID: "e8dec53f-51c9-4e3d-b111-45152d5b0c71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 23 10:54:04 crc kubenswrapper[4957]: I0123 10:54:04.069038 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnnbf\" (UniqueName: \"kubernetes.io/projected/4210122d-91b9-4890-a8d0-23e71c42d121-kube-api-access-xnnbf\") pod \"community-operators-l7xvl\" (UID: \"4210122d-91b9-4890-a8d0-23e71c42d121\") " pod="openshift-marketplace/community-operators-l7xvl" Jan 23 10:54:04 crc kubenswrapper[4957]: I0123 10:54:04.082043 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5zrcr" Jan 23 10:54:04 crc kubenswrapper[4957]: I0123 10:54:04.174064 4957 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-tbq4h"] Jan 23 10:54:04 crc kubenswrapper[4957]: I0123 10:54:04.177341 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 23 10:54:04 crc kubenswrapper[4957]: E0123 10:54:04.177424 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-23 10:54:04.677407629 +0000 UTC m=+154.214660316 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 23 10:54:04 crc kubenswrapper[4957]: I0123 10:54:04.180084 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4210122d-91b9-4890-a8d0-23e71c42d121-utilities\") pod \"community-operators-l7xvl\" (UID: \"4210122d-91b9-4890-a8d0-23e71c42d121\") " pod="openshift-marketplace/community-operators-l7xvl" Jan 23 10:54:04 crc kubenswrapper[4957]: I0123 10:54:04.180207 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hgc64\" (UID: \"e8dec53f-51c9-4e3d-b111-45152d5b0c71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hgc64" Jan 23 10:54:04 crc kubenswrapper[4957]: I0123 10:54:04.180307 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xnnbf\" (UniqueName: \"kubernetes.io/projected/4210122d-91b9-4890-a8d0-23e71c42d121-kube-api-access-xnnbf\") pod \"community-operators-l7xvl\" (UID: \"4210122d-91b9-4890-a8d0-23e71c42d121\") " pod="openshift-marketplace/community-operators-l7xvl" Jan 23 10:54:04 crc kubenswrapper[4957]: I0123 10:54:04.180410 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4210122d-91b9-4890-a8d0-23e71c42d121-catalog-content\") pod \"community-operators-l7xvl\" (UID: \"4210122d-91b9-4890-a8d0-23e71c42d121\") " pod="openshift-marketplace/community-operators-l7xvl" Jan 23 10:54:04 crc kubenswrapper[4957]: I0123 10:54:04.180675 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4210122d-91b9-4890-a8d0-23e71c42d121-utilities\") pod \"community-operators-l7xvl\" (UID: \"4210122d-91b9-4890-a8d0-23e71c42d121\") " pod="openshift-marketplace/community-operators-l7xvl" Jan 23 10:54:04 crc kubenswrapper[4957]: I0123 10:54:04.180967 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4210122d-91b9-4890-a8d0-23e71c42d121-catalog-content\") pod \"community-operators-l7xvl\" (UID: \"4210122d-91b9-4890-a8d0-23e71c42d121\") " pod="openshift-marketplace/community-operators-l7xvl" Jan 23 10:54:04 crc kubenswrapper[4957]: E0123 10:54:04.181189 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-23 10:54:04.681133746 +0000 UTC m=+154.218386433 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hgc64" (UID: "e8dec53f-51c9-4e3d-b111-45152d5b0c71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 23 10:54:04 crc kubenswrapper[4957]: I0123 10:54:04.183668 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tbq4h" Jan 23 10:54:04 crc kubenswrapper[4957]: I0123 10:54:04.202460 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tbq4h"] Jan 23 10:54:04 crc kubenswrapper[4957]: I0123 10:54:04.233339 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnnbf\" (UniqueName: \"kubernetes.io/projected/4210122d-91b9-4890-a8d0-23e71c42d121-kube-api-access-xnnbf\") pod \"community-operators-l7xvl\" (UID: \"4210122d-91b9-4890-a8d0-23e71c42d121\") " pod="openshift-marketplace/community-operators-l7xvl" Jan 23 10:54:04 crc kubenswrapper[4957]: I0123 10:54:04.292849 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 23 10:54:04 crc kubenswrapper[4957]: E0123 10:54:04.292976 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-23 10:54:04.792957295 +0000 UTC m=+154.330209982 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 23 10:54:04 crc kubenswrapper[4957]: I0123 10:54:04.293245 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e09bef4-a911-4771-b0e4-233eba62eddf-catalog-content\") pod \"certified-operators-tbq4h\" (UID: \"6e09bef4-a911-4771-b0e4-233eba62eddf\") " pod="openshift-marketplace/certified-operators-tbq4h" Jan 23 10:54:04 crc kubenswrapper[4957]: I0123 10:54:04.293333 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e09bef4-a911-4771-b0e4-233eba62eddf-utilities\") pod \"certified-operators-tbq4h\" (UID: \"6e09bef4-a911-4771-b0e4-233eba62eddf\") " pod="openshift-marketplace/certified-operators-tbq4h" Jan 23 10:54:04 crc kubenswrapper[4957]: I0123 10:54:04.293357 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hgc64\" (UID: \"e8dec53f-51c9-4e3d-b111-45152d5b0c71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hgc64" Jan 23 10:54:04 crc kubenswrapper[4957]: I0123 10:54:04.293398 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsspd\" (UniqueName: \"kubernetes.io/projected/6e09bef4-a911-4771-b0e4-233eba62eddf-kube-api-access-hsspd\") pod \"certified-operators-tbq4h\" (UID: \"6e09bef4-a911-4771-b0e4-233eba62eddf\") " pod="openshift-marketplace/certified-operators-tbq4h" Jan 23 10:54:04 crc kubenswrapper[4957]: E0123 10:54:04.293660 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-23 10:54:04.793652523 +0000 UTC m=+154.330905210 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hgc64" (UID: "e8dec53f-51c9-4e3d-b111-45152d5b0c71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 23 10:54:04 crc kubenswrapper[4957]: I0123 10:54:04.336324 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-z8lhm" event={"ID":"d140b4dc-6d8e-4940-9a60-aa98665ac1b2","Type":"ContainerStarted","Data":"832b2530065f9403f3ea7e639ef382b23802c7caaf0b2f0637788f6a8e6c2689"} Jan 23 10:54:04 crc kubenswrapper[4957]: I0123 10:54:04.337666 4957 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-7cfk5 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.22:8080/healthz\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body= Jan 23 10:54:04 crc kubenswrapper[4957]: I0123 10:54:04.337704 4957 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-7cfk5" podUID="c951bf2d-f3fe-4a75-8e95-040c46cb1f01" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.22:8080/healthz\": dial tcp 10.217.0.22:8080: connect: connection refused" Jan 23 10:54:04 crc kubenswrapper[4957]: I0123 10:54:04.353680 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l7xvl" Jan 23 10:54:04 crc kubenswrapper[4957]: I0123 10:54:04.362175 4957 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-87khr"] Jan 23 10:54:04 crc kubenswrapper[4957]: I0123 10:54:04.368017 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-87khr" Jan 23 10:54:04 crc kubenswrapper[4957]: I0123 10:54:04.374035 4957 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-rxncn" podStartSLOduration=9.374020064 podStartE2EDuration="9.374020064s" podCreationTimestamp="2026-01-23 10:53:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-23 10:54:04.371764955 +0000 UTC m=+153.909017642" watchObservedRunningTime="2026-01-23 10:54:04.374020064 +0000 UTC m=+153.911272751" Jan 23 10:54:04 crc kubenswrapper[4957]: I0123 10:54:04.393160 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-87khr"] Jan 23 10:54:04 crc kubenswrapper[4957]: I0123 10:54:04.394205 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 23 10:54:04 crc kubenswrapper[4957]: E0123 10:54:04.394782 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-01-23 10:54:04.894764824 +0000 UTC m=+154.432017501 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 23 10:54:04 crc kubenswrapper[4957]: I0123 10:54:04.399785 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hsspd\" (UniqueName: \"kubernetes.io/projected/6e09bef4-a911-4771-b0e4-233eba62eddf-kube-api-access-hsspd\") pod \"certified-operators-tbq4h\" (UID: \"6e09bef4-a911-4771-b0e4-233eba62eddf\") " pod="openshift-marketplace/certified-operators-tbq4h" Jan 23 10:54:04 crc kubenswrapper[4957]: I0123 10:54:04.399943 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e09bef4-a911-4771-b0e4-233eba62eddf-catalog-content\") pod \"certified-operators-tbq4h\" (UID: \"6e09bef4-a911-4771-b0e4-233eba62eddf\") " pod="openshift-marketplace/certified-operators-tbq4h" Jan 23 10:54:04 crc kubenswrapper[4957]: I0123 10:54:04.403048 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e09bef4-a911-4771-b0e4-233eba62eddf-utilities\") pod \"certified-operators-tbq4h\" (UID: \"6e09bef4-a911-4771-b0e4-233eba62eddf\") " pod="openshift-marketplace/certified-operators-tbq4h" Jan 23 10:54:04 crc kubenswrapper[4957]: I0123 10:54:04.403078 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hgc64\" (UID: \"e8dec53f-51c9-4e3d-b111-45152d5b0c71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hgc64" Jan 23 10:54:04 crc kubenswrapper[4957]: I0123 10:54:04.407413 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e09bef4-a911-4771-b0e4-233eba62eddf-catalog-content\") pod \"certified-operators-tbq4h\" (UID: \"6e09bef4-a911-4771-b0e4-233eba62eddf\") " pod="openshift-marketplace/certified-operators-tbq4h" Jan 23 10:54:04 crc kubenswrapper[4957]: I0123 10:54:04.414228 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e09bef4-a911-4771-b0e4-233eba62eddf-utilities\") pod \"certified-operators-tbq4h\" (UID: \"6e09bef4-a911-4771-b0e4-233eba62eddf\") " pod="openshift-marketplace/certified-operators-tbq4h" Jan 23 10:54:04 crc kubenswrapper[4957]: E0123 10:54:04.415608 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-23 10:54:04.915594446 +0000 UTC m=+154.452847133 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hgc64" (UID: "e8dec53f-51c9-4e3d-b111-45152d5b0c71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 23 10:54:04 crc kubenswrapper[4957]: I0123 10:54:04.424137 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hsspd\" (UniqueName: \"kubernetes.io/projected/6e09bef4-a911-4771-b0e4-233eba62eddf-kube-api-access-hsspd\") pod \"certified-operators-tbq4h\" (UID: \"6e09bef4-a911-4771-b0e4-233eba62eddf\") " pod="openshift-marketplace/certified-operators-tbq4h" Jan 23 10:54:04 crc kubenswrapper[4957]: I0123 10:54:04.504895 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 23 10:54:04 crc kubenswrapper[4957]: I0123 10:54:04.505169 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwt4b\" (UniqueName: \"kubernetes.io/projected/24be1c19-2466-4e1e-9232-9697534b5d6e-kube-api-access-nwt4b\") pod \"community-operators-87khr\" (UID: \"24be1c19-2466-4e1e-9232-9697534b5d6e\") " pod="openshift-marketplace/community-operators-87khr" Jan 23 10:54:04 crc kubenswrapper[4957]: I0123 10:54:04.505205 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24be1c19-2466-4e1e-9232-9697534b5d6e-utilities\") pod \"community-operators-87khr\" (UID: \"24be1c19-2466-4e1e-9232-9697534b5d6e\") " pod="openshift-marketplace/community-operators-87khr" Jan 23 10:54:04 crc kubenswrapper[4957]: I0123 10:54:04.505222 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24be1c19-2466-4e1e-9232-9697534b5d6e-catalog-content\") pod \"community-operators-87khr\" (UID: \"24be1c19-2466-4e1e-9232-9697534b5d6e\") " pod="openshift-marketplace/community-operators-87khr" Jan 23 10:54:04 crc kubenswrapper[4957]: E0123 10:54:04.505346 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-23 10:54:05.00533079 +0000 UTC m=+154.542583477 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 23 10:54:04 crc kubenswrapper[4957]: I0123 10:54:04.526188 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5zrcr"] Jan 23 10:54:04 crc kubenswrapper[4957]: I0123 10:54:04.532018 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tbq4h" Jan 23 10:54:04 crc kubenswrapper[4957]: I0123 10:54:04.606265 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24be1c19-2466-4e1e-9232-9697534b5d6e-utilities\") pod \"community-operators-87khr\" (UID: \"24be1c19-2466-4e1e-9232-9697534b5d6e\") " pod="openshift-marketplace/community-operators-87khr" Jan 23 10:54:04 crc kubenswrapper[4957]: I0123 10:54:04.606316 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24be1c19-2466-4e1e-9232-9697534b5d6e-catalog-content\") pod \"community-operators-87khr\" (UID: \"24be1c19-2466-4e1e-9232-9697534b5d6e\") " pod="openshift-marketplace/community-operators-87khr" Jan 23 10:54:04 crc kubenswrapper[4957]: I0123 10:54:04.606363 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hgc64\" (UID: \"e8dec53f-51c9-4e3d-b111-45152d5b0c71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hgc64" Jan 23 10:54:04 crc kubenswrapper[4957]: I0123 10:54:04.606423 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwt4b\" (UniqueName: \"kubernetes.io/projected/24be1c19-2466-4e1e-9232-9697534b5d6e-kube-api-access-nwt4b\") pod \"community-operators-87khr\" (UID: \"24be1c19-2466-4e1e-9232-9697534b5d6e\") " pod="openshift-marketplace/community-operators-87khr" Jan 23 10:54:04 crc kubenswrapper[4957]: I0123 10:54:04.607266 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24be1c19-2466-4e1e-9232-9697534b5d6e-utilities\") pod \"community-operators-87khr\" (UID: \"24be1c19-2466-4e1e-9232-9697534b5d6e\") " pod="openshift-marketplace/community-operators-87khr" Jan 23 10:54:04 crc kubenswrapper[4957]: I0123 10:54:04.607508 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24be1c19-2466-4e1e-9232-9697534b5d6e-catalog-content\") pod \"community-operators-87khr\" (UID: \"24be1c19-2466-4e1e-9232-9697534b5d6e\") " pod="openshift-marketplace/community-operators-87khr" Jan 23 10:54:04 crc kubenswrapper[4957]: E0123 10:54:04.607769 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-01-23 10:54:05.107758244 +0000 UTC m=+154.645010931 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hgc64" (UID: "e8dec53f-51c9-4e3d-b111-45152d5b0c71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 23 10:54:04 crc kubenswrapper[4957]: I0123 10:54:04.626062 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwt4b\" (UniqueName: \"kubernetes.io/projected/24be1c19-2466-4e1e-9232-9697534b5d6e-kube-api-access-nwt4b\") pod \"community-operators-87khr\" (UID: \"24be1c19-2466-4e1e-9232-9697534b5d6e\") " pod="openshift-marketplace/community-operators-87khr" Jan 23 10:54:04 crc kubenswrapper[4957]: I0123 10:54:04.695485 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-87khr" Jan 23 10:54:04 crc kubenswrapper[4957]: I0123 10:54:04.707334 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 23 10:54:04 crc kubenswrapper[4957]: E0123 10:54:04.707598 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-23 10:54:05.207584601 +0000 UTC m=+154.744837288 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 23 10:54:04 crc kubenswrapper[4957]: I0123 10:54:04.774176 4957 patch_prober.go:28] interesting pod/router-default-5444994796-6brn9 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 23 10:54:04 crc kubenswrapper[4957]: [-]has-synced failed: reason withheld Jan 23 10:54:04 crc kubenswrapper[4957]: [+]process-running ok Jan 23 10:54:04 crc kubenswrapper[4957]: healthz check failed Jan 23 10:54:04 crc kubenswrapper[4957]: I0123 10:54:04.774475 4957 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6brn9" podUID="beac68ed-ac4c-424e-aaff-79a4f7f246d0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 23 10:54:04 crc kubenswrapper[4957]: I0123 10:54:04.808971 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hgc64\" (UID: \"e8dec53f-51c9-4e3d-b111-45152d5b0c71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hgc64" Jan 23 10:54:04 crc kubenswrapper[4957]: E0123 10:54:04.809268 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-23 10:54:05.309254366 +0000 UTC m=+154.846507053 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hgc64" (UID: "e8dec53f-51c9-4e3d-b111-45152d5b0c71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 23 10:54:04 crc kubenswrapper[4957]: I0123 10:54:04.858401 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tbq4h"] Jan 23 10:54:04 crc kubenswrapper[4957]: I0123 10:54:04.909696 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 23 10:54:04 crc kubenswrapper[4957]: E0123 10:54:04.910058 4957 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-23 10:54:05.410044738 +0000 UTC m=+154.947297415 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 23 10:54:04 crc kubenswrapper[4957]: I0123 10:54:04.910340 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-l7xvl"] Jan 23 10:54:04 crc kubenswrapper[4957]: I0123 10:54:04.977434 4957 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-01-23T10:54:04.04483864Z","Handler":null,"Name":""} Jan 23 10:54:04 crc kubenswrapper[4957]: I0123 10:54:04.987180 4957 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Jan 23 10:54:04 crc kubenswrapper[4957]: I0123 10:54:04.987208 4957 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Jan 23 10:54:04 crc kubenswrapper[4957]: I0123 10:54:04.999761 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-87khr"] Jan 23 10:54:05 crc kubenswrapper[4957]: I0123 10:54:05.010934 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hgc64\" (UID: \"e8dec53f-51c9-4e3d-b111-45152d5b0c71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hgc64" Jan 23 10:54:05 crc kubenswrapper[4957]: I0123 10:54:05.013529 4957 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 23 10:54:05 crc kubenswrapper[4957]: I0123 10:54:05.013571 4957 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hgc64\" (UID: \"e8dec53f-51c9-4e3d-b111-45152d5b0c71\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-hgc64" Jan 23 10:54:05 crc kubenswrapper[4957]: I0123 10:54:05.054993 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hgc64\" (UID: \"e8dec53f-51c9-4e3d-b111-45152d5b0c71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hgc64" Jan 23 10:54:05 crc kubenswrapper[4957]: I0123 10:54:05.112389 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 23 10:54:05 crc kubenswrapper[4957]: I0123 10:54:05.128351 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 23 10:54:05 crc kubenswrapper[4957]: I0123 10:54:05.277169 4957 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-hgc64" Jan 23 10:54:05 crc kubenswrapper[4957]: I0123 10:54:05.353005 4957 generic.go:334] "Generic (PLEG): container finished" podID="6e09bef4-a911-4771-b0e4-233eba62eddf" containerID="d0929dde179d094b8eb228493c153348a41cbdfe85fac142803941e441b0e891" exitCode=0 Jan 23 10:54:05 crc kubenswrapper[4957]: I0123 10:54:05.353936 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tbq4h" event={"ID":"6e09bef4-a911-4771-b0e4-233eba62eddf","Type":"ContainerDied","Data":"d0929dde179d094b8eb228493c153348a41cbdfe85fac142803941e441b0e891"} Jan 23 10:54:05 crc kubenswrapper[4957]: I0123 10:54:05.353987 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tbq4h" event={"ID":"6e09bef4-a911-4771-b0e4-233eba62eddf","Type":"ContainerStarted","Data":"9c749650d5ed6a26e9eb1613757b6b2dae74cde7e52ef3aabf13e0a278c0ef47"} Jan 23 10:54:05 crc kubenswrapper[4957]: I0123 10:54:05.355886 4957 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 23 10:54:05 crc kubenswrapper[4957]: I0123 10:54:05.359941 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-z8lhm" event={"ID":"d140b4dc-6d8e-4940-9a60-aa98665ac1b2","Type":"ContainerStarted","Data":"2a430d62340103cc973ef1807dc98d63519beb3ea34ccad38f7c5c612b01022d"} Jan 23 10:54:05 crc kubenswrapper[4957]: I0123 10:54:05.359968 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-z8lhm" event={"ID":"d140b4dc-6d8e-4940-9a60-aa98665ac1b2","Type":"ContainerStarted","Data":"c9ca023799777365a7fffc48ff16e71108d962d646677ae10ff709062d7e28b3"} Jan 23 10:54:05 crc kubenswrapper[4957]: I0123 10:54:05.382796 4957 generic.go:334] "Generic (PLEG): container finished" podID="757f7f5f-b2fc-4fa0-b59b-63ad3548a307" containerID="ff8eb8517c84a585621502b458219b0c717c05e997760eb657da815ea20aa026" exitCode=0 Jan 23 10:54:05 crc kubenswrapper[4957]: I0123 10:54:05.383054 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29486085-pm58b" event={"ID":"757f7f5f-b2fc-4fa0-b59b-63ad3548a307","Type":"ContainerDied","Data":"ff8eb8517c84a585621502b458219b0c717c05e997760eb657da815ea20aa026"} Jan 23 10:54:05 crc kubenswrapper[4957]: I0123 10:54:05.386374 4957 generic.go:334] "Generic (PLEG): container finished" podID="24be1c19-2466-4e1e-9232-9697534b5d6e" containerID="7615e0bd7b82a55e8ca5b1a82d1b3a24dde36d0d806f689bec9bf3a9ea3cf7e9" exitCode=0 Jan 23 10:54:05 crc kubenswrapper[4957]: I0123 10:54:05.386428 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-87khr" event={"ID":"24be1c19-2466-4e1e-9232-9697534b5d6e","Type":"ContainerDied","Data":"7615e0bd7b82a55e8ca5b1a82d1b3a24dde36d0d806f689bec9bf3a9ea3cf7e9"} Jan 23 10:54:05 crc kubenswrapper[4957]: I0123 10:54:05.386450 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-87khr" event={"ID":"24be1c19-2466-4e1e-9232-9697534b5d6e","Type":"ContainerStarted","Data":"bf3669c35d73fe0cc3020369027d3ad209c6dda235e4a23a24cb1cc0aa9447ef"} Jan 23 10:54:05 crc kubenswrapper[4957]: I0123 10:54:05.388650 4957 generic.go:334] "Generic (PLEG): container finished" podID="4210122d-91b9-4890-a8d0-23e71c42d121" containerID="b11dcc9ce8fd4e67119d96225fda2b94e6a9eec65922e063ac911a339fe96636" exitCode=0 
Jan 23 10:54:05 crc kubenswrapper[4957]: I0123 10:54:05.389089 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l7xvl" event={"ID":"4210122d-91b9-4890-a8d0-23e71c42d121","Type":"ContainerDied","Data":"b11dcc9ce8fd4e67119d96225fda2b94e6a9eec65922e063ac911a339fe96636"} Jan 23 10:54:05 crc kubenswrapper[4957]: I0123 10:54:05.389112 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l7xvl" event={"ID":"4210122d-91b9-4890-a8d0-23e71c42d121","Type":"ContainerStarted","Data":"67e6131a830a5fad0d0cb1fb6f718d2306bc46a15bb47312ef6d13759ceb1548"} Jan 23 10:54:05 crc kubenswrapper[4957]: I0123 10:54:05.390961 4957 generic.go:334] "Generic (PLEG): container finished" podID="fd5862ba-aabd-4c6f-b546-3f22d40592b8" containerID="5f2292dcf69571c22928f32e7e6f12fb20570c177c2684951a468bc0d34c8d88" exitCode=0 Jan 23 10:54:05 crc kubenswrapper[4957]: I0123 10:54:05.391430 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5zrcr" event={"ID":"fd5862ba-aabd-4c6f-b546-3f22d40592b8","Type":"ContainerDied","Data":"5f2292dcf69571c22928f32e7e6f12fb20570c177c2684951a468bc0d34c8d88"} Jan 23 10:54:05 crc kubenswrapper[4957]: I0123 10:54:05.391455 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5zrcr" event={"ID":"fd5862ba-aabd-4c6f-b546-3f22d40592b8","Type":"ContainerStarted","Data":"b8fdf4baae8de0f9c652d6977b171226b4344032580c44c1b4d15d2101ad5a59"} Jan 23 10:54:05 crc kubenswrapper[4957]: I0123 10:54:05.399603 4957 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-z8lhm" podStartSLOduration=11.399583713 podStartE2EDuration="11.399583713s" podCreationTimestamp="2026-01-23 10:53:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-23 10:54:05.397069238 +0000 UTC m=+154.934321985" watchObservedRunningTime="2026-01-23 10:54:05.399583713 +0000 UTC m=+154.936836400" Jan 23 10:54:05 crc kubenswrapper[4957]: I0123 10:54:05.486597 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-hgc64"] Jan 23 10:54:05 crc kubenswrapper[4957]: W0123 10:54:05.493306 4957 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode8dec53f_51c9_4e3d_b111_45152d5b0c71.slice/crio-fe27ac06f9ec11060c6e98190f122fef33d38c7771103781850cf7a5dc35db47 WatchSource:0}: Error finding container fe27ac06f9ec11060c6e98190f122fef33d38c7771103781850cf7a5dc35db47: Status 404 returned error can't find the container with id fe27ac06f9ec11060c6e98190f122fef33d38c7771103781850cf7a5dc35db47 Jan 23 10:54:05 crc kubenswrapper[4957]: I0123 10:54:05.669539 4957 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Jan 23 10:54:05 crc kubenswrapper[4957]: I0123 10:54:05.671729 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Jan 23 10:54:05 crc kubenswrapper[4957]: I0123 10:54:05.671899 4957 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 23 10:54:05 crc kubenswrapper[4957]: I0123 10:54:05.674486 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Jan 23 10:54:05 crc kubenswrapper[4957]: I0123 10:54:05.681594 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Jan 23 10:54:05 crc kubenswrapper[4957]: I0123 10:54:05.724502 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/89d5d26c-8d52-47a1-ad35-7b23eecf210e-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"89d5d26c-8d52-47a1-ad35-7b23eecf210e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 23 10:54:05 crc kubenswrapper[4957]: I0123 10:54:05.724675 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/89d5d26c-8d52-47a1-ad35-7b23eecf210e-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"89d5d26c-8d52-47a1-ad35-7b23eecf210e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 23 10:54:05 crc kubenswrapper[4957]: I0123 10:54:05.771223 4957 patch_prober.go:28] interesting pod/router-default-5444994796-6brn9 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 23 10:54:05 crc kubenswrapper[4957]: [-]has-synced failed: reason withheld Jan 23 10:54:05 crc kubenswrapper[4957]: [+]process-running ok Jan 23 10:54:05 crc kubenswrapper[4957]: healthz check failed Jan 23 10:54:05 crc kubenswrapper[4957]: I0123 10:54:05.771317 4957 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6brn9" podUID="beac68ed-ac4c-424e-aaff-79a4f7f246d0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 23 10:54:05 crc kubenswrapper[4957]: I0123 10:54:05.825774 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/89d5d26c-8d52-47a1-ad35-7b23eecf210e-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"89d5d26c-8d52-47a1-ad35-7b23eecf210e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 23 10:54:05 crc kubenswrapper[4957]: I0123 10:54:05.825903 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/89d5d26c-8d52-47a1-ad35-7b23eecf210e-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"89d5d26c-8d52-47a1-ad35-7b23eecf210e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 23 10:54:05 crc kubenswrapper[4957]: I0123 10:54:05.826209 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/89d5d26c-8d52-47a1-ad35-7b23eecf210e-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"89d5d26c-8d52-47a1-ad35-7b23eecf210e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 23 10:54:05 crc kubenswrapper[4957]: I0123 10:54:05.858347 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/89d5d26c-8d52-47a1-ad35-7b23eecf210e-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"89d5d26c-8d52-47a1-ad35-7b23eecf210e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 23 10:54:05 crc kubenswrapper[4957]: I0123 10:54:05.966500 4957 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-2c452"] Jan 23 10:54:05 crc kubenswrapper[4957]: I0123 10:54:05.967718 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2c452" Jan 23 10:54:05 crc kubenswrapper[4957]: I0123 10:54:05.969175 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Jan 23 10:54:05 crc kubenswrapper[4957]: I0123 10:54:05.977908 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2c452"] Jan 23 10:54:05 crc kubenswrapper[4957]: I0123 10:54:05.994271 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 23 10:54:06 crc kubenswrapper[4957]: I0123 10:54:06.033867 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4xnr\" (UniqueName: \"kubernetes.io/projected/783873e0-bce9-4f05-849d-0fe265010d39-kube-api-access-m4xnr\") pod \"redhat-marketplace-2c452\" (UID: \"783873e0-bce9-4f05-849d-0fe265010d39\") " pod="openshift-marketplace/redhat-marketplace-2c452" Jan 23 10:54:06 crc kubenswrapper[4957]: I0123 10:54:06.033954 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/783873e0-bce9-4f05-849d-0fe265010d39-utilities\") pod \"redhat-marketplace-2c452\" (UID: \"783873e0-bce9-4f05-849d-0fe265010d39\") " pod="openshift-marketplace/redhat-marketplace-2c452" Jan 23 10:54:06 crc kubenswrapper[4957]: I0123 10:54:06.034024 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/783873e0-bce9-4f05-849d-0fe265010d39-catalog-content\") pod \"redhat-marketplace-2c452\" (UID: \"783873e0-bce9-4f05-849d-0fe265010d39\") " pod="openshift-marketplace/redhat-marketplace-2c452" Jan 23 10:54:06 crc kubenswrapper[4957]: I0123 10:54:06.134829 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/783873e0-bce9-4f05-849d-0fe265010d39-catalog-content\") pod \"redhat-marketplace-2c452\" (UID: \"783873e0-bce9-4f05-849d-0fe265010d39\") " pod="openshift-marketplace/redhat-marketplace-2c452" Jan 23 10:54:06 crc kubenswrapper[4957]: I0123 10:54:06.134894 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4xnr\" (UniqueName: \"kubernetes.io/projected/783873e0-bce9-4f05-849d-0fe265010d39-kube-api-access-m4xnr\") pod \"redhat-marketplace-2c452\" (UID: \"783873e0-bce9-4f05-849d-0fe265010d39\") " pod="openshift-marketplace/redhat-marketplace-2c452" Jan 23 10:54:06 crc kubenswrapper[4957]: I0123 10:54:06.134951 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/783873e0-bce9-4f05-849d-0fe265010d39-utilities\") pod \"redhat-marketplace-2c452\" (UID: \"783873e0-bce9-4f05-849d-0fe265010d39\") " 
pod="openshift-marketplace/redhat-marketplace-2c452" Jan 23 10:54:06 crc kubenswrapper[4957]: I0123 10:54:06.135775 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/783873e0-bce9-4f05-849d-0fe265010d39-utilities\") pod \"redhat-marketplace-2c452\" (UID: \"783873e0-bce9-4f05-849d-0fe265010d39\") " pod="openshift-marketplace/redhat-marketplace-2c452" Jan 23 10:54:06 crc kubenswrapper[4957]: I0123 10:54:06.136026 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/783873e0-bce9-4f05-849d-0fe265010d39-catalog-content\") pod \"redhat-marketplace-2c452\" (UID: \"783873e0-bce9-4f05-849d-0fe265010d39\") " pod="openshift-marketplace/redhat-marketplace-2c452" Jan 23 10:54:06 crc kubenswrapper[4957]: I0123 10:54:06.194034 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4xnr\" (UniqueName: \"kubernetes.io/projected/783873e0-bce9-4f05-849d-0fe265010d39-kube-api-access-m4xnr\") pod \"redhat-marketplace-2c452\" (UID: \"783873e0-bce9-4f05-849d-0fe265010d39\") " pod="openshift-marketplace/redhat-marketplace-2c452" Jan 23 10:54:06 crc kubenswrapper[4957]: I0123 10:54:06.285197 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2c452" Jan 23 10:54:06 crc kubenswrapper[4957]: I0123 10:54:06.343156 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Jan 23 10:54:06 crc kubenswrapper[4957]: I0123 10:54:06.349585 4957 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-5jhgc"] Jan 23 10:54:06 crc kubenswrapper[4957]: I0123 10:54:06.361600 4957 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5jhgc" Jan 23 10:54:06 crc kubenswrapper[4957]: I0123 10:54:06.367674 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5jhgc"] Jan 23 10:54:06 crc kubenswrapper[4957]: I0123 10:54:06.405094 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-hgc64" event={"ID":"e8dec53f-51c9-4e3d-b111-45152d5b0c71","Type":"ContainerStarted","Data":"865149ad3d793bd145d4f28055a8b145b6fbf740f68717bb8ec07cc81d0f1f4b"} Jan 23 10:54:06 crc kubenswrapper[4957]: I0123 10:54:06.405153 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-hgc64" event={"ID":"e8dec53f-51c9-4e3d-b111-45152d5b0c71","Type":"ContainerStarted","Data":"fe27ac06f9ec11060c6e98190f122fef33d38c7771103781850cf7a5dc35db47"} Jan 23 10:54:06 crc kubenswrapper[4957]: I0123 10:54:06.405322 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-hgc64" Jan 23 10:54:06 crc kubenswrapper[4957]: I0123 10:54:06.408333 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"89d5d26c-8d52-47a1-ad35-7b23eecf210e","Type":"ContainerStarted","Data":"1dd65bd4e9aa3eaf7de7c9c5433e5d2a371fbc65791fd1da5977506ee3e0c5c6"} Jan 23 10:54:06 crc kubenswrapper[4957]: I0123 10:54:06.444024 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hfkc\" (UniqueName: \"kubernetes.io/projected/784bb395-54fe-47ab-9fd3-0298329d8566-kube-api-access-4hfkc\") pod \"redhat-marketplace-5jhgc\" (UID: \"784bb395-54fe-47ab-9fd3-0298329d8566\") " pod="openshift-marketplace/redhat-marketplace-5jhgc" Jan 23 10:54:06 crc kubenswrapper[4957]: I0123 10:54:06.444124 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/784bb395-54fe-47ab-9fd3-0298329d8566-utilities\") pod \"redhat-marketplace-5jhgc\" (UID: \"784bb395-54fe-47ab-9fd3-0298329d8566\") " pod="openshift-marketplace/redhat-marketplace-5jhgc" Jan 23 10:54:06 crc kubenswrapper[4957]: I0123 10:54:06.444156 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/784bb395-54fe-47ab-9fd3-0298329d8566-catalog-content\") pod \"redhat-marketplace-5jhgc\" (UID: \"784bb395-54fe-47ab-9fd3-0298329d8566\") " pod="openshift-marketplace/redhat-marketplace-5jhgc" Jan 23 10:54:06 crc kubenswrapper[4957]: I0123 10:54:06.545292 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hfkc\" (UniqueName: \"kubernetes.io/projected/784bb395-54fe-47ab-9fd3-0298329d8566-kube-api-access-4hfkc\") pod \"redhat-marketplace-5jhgc\" (UID: \"784bb395-54fe-47ab-9fd3-0298329d8566\") " pod="openshift-marketplace/redhat-marketplace-5jhgc" Jan 23 10:54:06 crc kubenswrapper[4957]: I0123 10:54:06.545767 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/784bb395-54fe-47ab-9fd3-0298329d8566-utilities\") pod \"redhat-marketplace-5jhgc\" (UID: \"784bb395-54fe-47ab-9fd3-0298329d8566\") " pod="openshift-marketplace/redhat-marketplace-5jhgc" Jan 23 10:54:06 crc kubenswrapper[4957]: I0123 10:54:06.545981 4957 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/784bb395-54fe-47ab-9fd3-0298329d8566-catalog-content\") pod \"redhat-marketplace-5jhgc\" (UID: \"784bb395-54fe-47ab-9fd3-0298329d8566\") " pod="openshift-marketplace/redhat-marketplace-5jhgc" Jan 23 10:54:06 crc kubenswrapper[4957]: I0123 10:54:06.546440 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/784bb395-54fe-47ab-9fd3-0298329d8566-utilities\") pod \"redhat-marketplace-5jhgc\" (UID: \"784bb395-54fe-47ab-9fd3-0298329d8566\") " pod="openshift-marketplace/redhat-marketplace-5jhgc" Jan 23 10:54:06 crc kubenswrapper[4957]: I0123 10:54:06.546451 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/784bb395-54fe-47ab-9fd3-0298329d8566-catalog-content\") pod \"redhat-marketplace-5jhgc\" (UID: \"784bb395-54fe-47ab-9fd3-0298329d8566\") " pod="openshift-marketplace/redhat-marketplace-5jhgc" Jan 23 10:54:06 crc kubenswrapper[4957]: I0123 10:54:06.562807 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hfkc\" (UniqueName: \"kubernetes.io/projected/784bb395-54fe-47ab-9fd3-0298329d8566-kube-api-access-4hfkc\") pod \"redhat-marketplace-5jhgc\" (UID: \"784bb395-54fe-47ab-9fd3-0298329d8566\") " pod="openshift-marketplace/redhat-marketplace-5jhgc" Jan 23 10:54:06 crc kubenswrapper[4957]: I0123 10:54:06.632365 4957 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-hgc64" podStartSLOduration=136.632345573 podStartE2EDuration="2m16.632345573s" podCreationTimestamp="2026-01-23 10:51:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-23 10:54:06.424519826 +0000 UTC m=+155.961772533" watchObservedRunningTime="2026-01-23 10:54:06.632345573 +0000 UTC m=+156.169598260" Jan 23 10:54:06 crc kubenswrapper[4957]: I0123 10:54:06.635627 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2c452"] Jan 23 10:54:06 crc kubenswrapper[4957]: W0123 10:54:06.653248 4957 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod783873e0_bce9_4f05_849d_0fe265010d39.slice/crio-073ac0316bfb06567f325fbccf6bda2fef26ff25f51bf7bf248d8f27d1a69139 WatchSource:0}: Error finding container 073ac0316bfb06567f325fbccf6bda2fef26ff25f51bf7bf248d8f27d1a69139: Status 404 returned error can't find the container with id 073ac0316bfb06567f325fbccf6bda2fef26ff25f51bf7bf248d8f27d1a69139 Jan 23 10:54:06 crc kubenswrapper[4957]: I0123 10:54:06.666333 4957 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29486085-pm58b" Jan 23 10:54:06 crc kubenswrapper[4957]: I0123 10:54:06.721169 4957 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5jhgc" Jan 23 10:54:06 crc kubenswrapper[4957]: I0123 10:54:06.748100 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/757f7f5f-b2fc-4fa0-b59b-63ad3548a307-secret-volume\") pod \"757f7f5f-b2fc-4fa0-b59b-63ad3548a307\" (UID: \"757f7f5f-b2fc-4fa0-b59b-63ad3548a307\") " Jan 23 10:54:06 crc kubenswrapper[4957]: I0123 10:54:06.748184 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/757f7f5f-b2fc-4fa0-b59b-63ad3548a307-config-volume\") pod \"757f7f5f-b2fc-4fa0-b59b-63ad3548a307\" (UID: \"757f7f5f-b2fc-4fa0-b59b-63ad3548a307\") " Jan 23 10:54:06 crc kubenswrapper[4957]: I0123 10:54:06.748222 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lncns\" (UniqueName: \"kubernetes.io/projected/757f7f5f-b2fc-4fa0-b59b-63ad3548a307-kube-api-access-lncns\") pod \"757f7f5f-b2fc-4fa0-b59b-63ad3548a307\" (UID: \"757f7f5f-b2fc-4fa0-b59b-63ad3548a307\") " Jan 23 10:54:06 crc kubenswrapper[4957]: I0123 10:54:06.749314 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/757f7f5f-b2fc-4fa0-b59b-63ad3548a307-config-volume" (OuterVolumeSpecName: "config-volume") pod "757f7f5f-b2fc-4fa0-b59b-63ad3548a307" (UID: "757f7f5f-b2fc-4fa0-b59b-63ad3548a307"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 10:54:06 crc kubenswrapper[4957]: I0123 10:54:06.754950 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/757f7f5f-b2fc-4fa0-b59b-63ad3548a307-kube-api-access-lncns" (OuterVolumeSpecName: "kube-api-access-lncns") pod "757f7f5f-b2fc-4fa0-b59b-63ad3548a307" (UID: "757f7f5f-b2fc-4fa0-b59b-63ad3548a307"). InnerVolumeSpecName "kube-api-access-lncns". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 23 10:54:06 crc kubenswrapper[4957]: I0123 10:54:06.755185 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/757f7f5f-b2fc-4fa0-b59b-63ad3548a307-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "757f7f5f-b2fc-4fa0-b59b-63ad3548a307" (UID: "757f7f5f-b2fc-4fa0-b59b-63ad3548a307"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 23 10:54:06 crc kubenswrapper[4957]: I0123 10:54:06.770912 4957 patch_prober.go:28] interesting pod/router-default-5444994796-6brn9 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 23 10:54:06 crc kubenswrapper[4957]: [-]has-synced failed: reason withheld Jan 23 10:54:06 crc kubenswrapper[4957]: [+]process-running ok Jan 23 10:54:06 crc kubenswrapper[4957]: healthz check failed Jan 23 10:54:06 crc kubenswrapper[4957]: I0123 10:54:06.770954 4957 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6brn9" podUID="beac68ed-ac4c-424e-aaff-79a4f7f246d0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 23 10:54:06 crc kubenswrapper[4957]: I0123 10:54:06.790409 4957 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Jan 23 10:54:06 crc kubenswrapper[4957]: I0123 10:54:06.855183 4957 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/757f7f5f-b2fc-4fa0-b59b-63ad3548a307-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 23 10:54:06 crc kubenswrapper[4957]: I0123 10:54:06.856101 4957 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/757f7f5f-b2fc-4fa0-b59b-63ad3548a307-config-volume\") on node \"crc\" DevicePath \"\"" Jan 23 10:54:06 crc kubenswrapper[4957]: I0123 10:54:06.856309 4957 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lncns\" (UniqueName: \"kubernetes.io/projected/757f7f5f-b2fc-4fa0-b59b-63ad3548a307-kube-api-access-lncns\") on node \"crc\" DevicePath \"\"" Jan 23 10:54:06 crc kubenswrapper[4957]: I0123 10:54:06.891069 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-d94z6" Jan 23 10:54:06 crc kubenswrapper[4957]: I0123 10:54:06.951027 4957 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-l54d2"] Jan 23 10:54:06 crc kubenswrapper[4957]: E0123 10:54:06.951225 4957 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="757f7f5f-b2fc-4fa0-b59b-63ad3548a307" containerName="collect-profiles" Jan 23 10:54:06 crc kubenswrapper[4957]: I0123 10:54:06.951240 4957 state_mem.go:107] "Deleted CPUSet assignment" podUID="757f7f5f-b2fc-4fa0-b59b-63ad3548a307" containerName="collect-profiles" Jan 23 10:54:06 crc kubenswrapper[4957]: I0123 10:54:06.951391 4957 memory_manager.go:354] "RemoveStaleState removing state" podUID="757f7f5f-b2fc-4fa0-b59b-63ad3548a307" containerName="collect-profiles" Jan 23 10:54:06 crc kubenswrapper[4957]: I0123 10:54:06.952047 4957 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-l54d2" Jan 23 10:54:06 crc kubenswrapper[4957]: I0123 10:54:06.957526 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Jan 23 10:54:06 crc kubenswrapper[4957]: I0123 10:54:06.963057 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-l54d2"] Jan 23 10:54:06 crc kubenswrapper[4957]: I0123 10:54:06.981869 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fh2f4" Jan 23 10:54:06 crc kubenswrapper[4957]: I0123 10:54:06.981897 4957 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fh2f4" Jan 23 10:54:06 crc kubenswrapper[4957]: I0123 10:54:06.987147 4957 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-frdsx" Jan 23 10:54:06 crc kubenswrapper[4957]: I0123 10:54:06.987478 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-frdsx" Jan 23 10:54:06 crc kubenswrapper[4957]: I0123 10:54:06.989957 4957 patch_prober.go:28] interesting pod/console-f9d7485db-frdsx container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.30:8443/health\": dial tcp 10.217.0.30:8443: connect: connection refused" start-of-body= Jan 23 10:54:06 crc kubenswrapper[4957]: I0123 10:54:06.989991 4957 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-frdsx" podUID="b721a986-d921-41f6-ba96-47647e168858" containerName="console" probeResult="failure" output="Get \"https://10.217.0.30:8443/health\": dial tcp 10.217.0.30:8443: connect: connection refused" Jan 23 10:54:06 crc kubenswrapper[4957]: I0123 10:54:06.992418 4957 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fh2f4" Jan 23 10:54:07 crc kubenswrapper[4957]: I0123 10:54:07.022557 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5jhgc"] Jan 23 10:54:07 crc kubenswrapper[4957]: W0123 10:54:07.034979 4957 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod784bb395_54fe_47ab_9fd3_0298329d8566.slice/crio-8374321db367b7447922d4e6fd9cd26e78708d2cd6b9059804b69b4731434fed WatchSource:0}: Error finding container 8374321db367b7447922d4e6fd9cd26e78708d2cd6b9059804b69b4731434fed: Status 404 returned error can't find the container with id 8374321db367b7447922d4e6fd9cd26e78708d2cd6b9059804b69b4731434fed Jan 23 10:54:07 crc kubenswrapper[4957]: I0123 10:54:07.062778 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e16b20f4-5d6a-4cf6-878a-3cf03afe72bb-catalog-content\") pod \"redhat-operators-l54d2\" (UID: \"e16b20f4-5d6a-4cf6-878a-3cf03afe72bb\") " pod="openshift-marketplace/redhat-operators-l54d2" Jan 23 10:54:07 crc kubenswrapper[4957]: I0123 10:54:07.063042 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-km4jv\" (UniqueName: \"kubernetes.io/projected/e16b20f4-5d6a-4cf6-878a-3cf03afe72bb-kube-api-access-km4jv\") pod \"redhat-operators-l54d2\" (UID: 
\"e16b20f4-5d6a-4cf6-878a-3cf03afe72bb\") " pod="openshift-marketplace/redhat-operators-l54d2" Jan 23 10:54:07 crc kubenswrapper[4957]: I0123 10:54:07.063078 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e16b20f4-5d6a-4cf6-878a-3cf03afe72bb-utilities\") pod \"redhat-operators-l54d2\" (UID: \"e16b20f4-5d6a-4cf6-878a-3cf03afe72bb\") " pod="openshift-marketplace/redhat-operators-l54d2" Jan 23 10:54:07 crc kubenswrapper[4957]: I0123 10:54:07.160699 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-5t9v6" Jan 23 10:54:07 crc kubenswrapper[4957]: I0123 10:54:07.160737 4957 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-5t9v6" Jan 23 10:54:07 crc kubenswrapper[4957]: I0123 10:54:07.164375 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e16b20f4-5d6a-4cf6-878a-3cf03afe72bb-catalog-content\") pod \"redhat-operators-l54d2\" (UID: \"e16b20f4-5d6a-4cf6-878a-3cf03afe72bb\") " pod="openshift-marketplace/redhat-operators-l54d2" Jan 23 10:54:07 crc kubenswrapper[4957]: I0123 10:54:07.164459 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-km4jv\" (UniqueName: \"kubernetes.io/projected/e16b20f4-5d6a-4cf6-878a-3cf03afe72bb-kube-api-access-km4jv\") pod \"redhat-operators-l54d2\" (UID: \"e16b20f4-5d6a-4cf6-878a-3cf03afe72bb\") " pod="openshift-marketplace/redhat-operators-l54d2" Jan 23 10:54:07 crc kubenswrapper[4957]: I0123 10:54:07.164484 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e16b20f4-5d6a-4cf6-878a-3cf03afe72bb-utilities\") pod \"redhat-operators-l54d2\" (UID: \"e16b20f4-5d6a-4cf6-878a-3cf03afe72bb\") " pod="openshift-marketplace/redhat-operators-l54d2" Jan 23 10:54:07 crc kubenswrapper[4957]: I0123 10:54:07.164909 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e16b20f4-5d6a-4cf6-878a-3cf03afe72bb-utilities\") pod \"redhat-operators-l54d2\" (UID: \"e16b20f4-5d6a-4cf6-878a-3cf03afe72bb\") " pod="openshift-marketplace/redhat-operators-l54d2" Jan 23 10:54:07 crc kubenswrapper[4957]: I0123 10:54:07.166439 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e16b20f4-5d6a-4cf6-878a-3cf03afe72bb-catalog-content\") pod \"redhat-operators-l54d2\" (UID: \"e16b20f4-5d6a-4cf6-878a-3cf03afe72bb\") " pod="openshift-marketplace/redhat-operators-l54d2" Jan 23 10:54:07 crc kubenswrapper[4957]: I0123 10:54:07.169239 4957 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-5t9v6" Jan 23 10:54:07 crc kubenswrapper[4957]: I0123 10:54:07.182140 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-km4jv\" (UniqueName: \"kubernetes.io/projected/e16b20f4-5d6a-4cf6-878a-3cf03afe72bb-kube-api-access-km4jv\") pod \"redhat-operators-l54d2\" (UID: \"e16b20f4-5d6a-4cf6-878a-3cf03afe72bb\") " pod="openshift-marketplace/redhat-operators-l54d2" Jan 23 10:54:07 crc kubenswrapper[4957]: I0123 10:54:07.291441 4957 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-l54d2" Jan 23 10:54:07 crc kubenswrapper[4957]: I0123 10:54:07.311375 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-rxncn" Jan 23 10:54:07 crc kubenswrapper[4957]: I0123 10:54:07.356010 4957 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-f5qpn"] Jan 23 10:54:07 crc kubenswrapper[4957]: I0123 10:54:07.357448 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-f5qpn" Jan 23 10:54:07 crc kubenswrapper[4957]: I0123 10:54:07.362852 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-f5qpn"] Jan 23 10:54:07 crc kubenswrapper[4957]: I0123 10:54:07.426645 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"89d5d26c-8d52-47a1-ad35-7b23eecf210e","Type":"ContainerStarted","Data":"0356a953de35192689a4efecf7dea62a864a37066621e83c75a039e1c9570160"} Jan 23 10:54:07 crc kubenswrapper[4957]: I0123 10:54:07.431535 4957 generic.go:334] "Generic (PLEG): container finished" podID="784bb395-54fe-47ab-9fd3-0298329d8566" containerID="dbf96a68664a0fab9835c3fe7a05995dffe94ef88eabd3e62c2f3ba960e97f71" exitCode=0 Jan 23 10:54:07 crc kubenswrapper[4957]: I0123 10:54:07.431672 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5jhgc" event={"ID":"784bb395-54fe-47ab-9fd3-0298329d8566","Type":"ContainerDied","Data":"dbf96a68664a0fab9835c3fe7a05995dffe94ef88eabd3e62c2f3ba960e97f71"} Jan 23 10:54:07 crc kubenswrapper[4957]: I0123 10:54:07.431696 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5jhgc" event={"ID":"784bb395-54fe-47ab-9fd3-0298329d8566","Type":"ContainerStarted","Data":"8374321db367b7447922d4e6fd9cd26e78708d2cd6b9059804b69b4731434fed"} Jan 23 10:54:07 crc kubenswrapper[4957]: I0123 10:54:07.440347 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29486085-pm58b" event={"ID":"757f7f5f-b2fc-4fa0-b59b-63ad3548a307","Type":"ContainerDied","Data":"01fbe8a78f7bb32fbfe3016b603be8726970d9abe318cffafdbd39f3382e2270"} Jan 23 10:54:07 crc kubenswrapper[4957]: I0123 10:54:07.440398 4957 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="01fbe8a78f7bb32fbfe3016b603be8726970d9abe318cffafdbd39f3382e2270" Jan 23 10:54:07 crc kubenswrapper[4957]: I0123 10:54:07.440491 4957 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29486085-pm58b" Jan 23 10:54:07 crc kubenswrapper[4957]: I0123 10:54:07.442706 4957 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=2.442690604 podStartE2EDuration="2.442690604s" podCreationTimestamp="2026-01-23 10:54:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-23 10:54:07.439486069 +0000 UTC m=+156.976738756" watchObservedRunningTime="2026-01-23 10:54:07.442690604 +0000 UTC m=+156.979943281" Jan 23 10:54:07 crc kubenswrapper[4957]: I0123 10:54:07.443289 4957 generic.go:334] "Generic (PLEG): container finished" podID="783873e0-bce9-4f05-849d-0fe265010d39" containerID="bd860e2065ccae8389202d5b4fbc4f16bb131b9680fbe7cb3fcdd9c3cb491b50" exitCode=0 Jan 23 10:54:07 crc kubenswrapper[4957]: I0123 10:54:07.443634 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2c452" event={"ID":"783873e0-bce9-4f05-849d-0fe265010d39","Type":"ContainerDied","Data":"bd860e2065ccae8389202d5b4fbc4f16bb131b9680fbe7cb3fcdd9c3cb491b50"} Jan 23 10:54:07 crc kubenswrapper[4957]: I0123 10:54:07.443668 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2c452" event={"ID":"783873e0-bce9-4f05-849d-0fe265010d39","Type":"ContainerStarted","Data":"073ac0316bfb06567f325fbccf6bda2fef26ff25f51bf7bf248d8f27d1a69139"} Jan 23 10:54:07 crc kubenswrapper[4957]: I0123 10:54:07.448706 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-5t9v6" Jan 23 10:54:07 crc kubenswrapper[4957]: I0123 10:54:07.458238 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fh2f4" Jan 23 10:54:07 crc kubenswrapper[4957]: I0123 10:54:07.467598 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0641b36-685a-4625-93cb-a6159de3628e-utilities\") pod \"redhat-operators-f5qpn\" (UID: \"c0641b36-685a-4625-93cb-a6159de3628e\") " pod="openshift-marketplace/redhat-operators-f5qpn" Jan 23 10:54:07 crc kubenswrapper[4957]: I0123 10:54:07.467645 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vx99p\" (UniqueName: \"kubernetes.io/projected/c0641b36-685a-4625-93cb-a6159de3628e-kube-api-access-vx99p\") pod \"redhat-operators-f5qpn\" (UID: \"c0641b36-685a-4625-93cb-a6159de3628e\") " pod="openshift-marketplace/redhat-operators-f5qpn" Jan 23 10:54:07 crc kubenswrapper[4957]: I0123 10:54:07.467666 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0641b36-685a-4625-93cb-a6159de3628e-catalog-content\") pod \"redhat-operators-f5qpn\" (UID: \"c0641b36-685a-4625-93cb-a6159de3628e\") " pod="openshift-marketplace/redhat-operators-f5qpn" Jan 23 10:54:07 crc kubenswrapper[4957]: I0123 10:54:07.570899 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vx99p\" (UniqueName: \"kubernetes.io/projected/c0641b36-685a-4625-93cb-a6159de3628e-kube-api-access-vx99p\") pod \"redhat-operators-f5qpn\" (UID: \"c0641b36-685a-4625-93cb-a6159de3628e\") " 
pod="openshift-marketplace/redhat-operators-f5qpn" Jan 23 10:54:07 crc kubenswrapper[4957]: I0123 10:54:07.571255 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0641b36-685a-4625-93cb-a6159de3628e-catalog-content\") pod \"redhat-operators-f5qpn\" (UID: \"c0641b36-685a-4625-93cb-a6159de3628e\") " pod="openshift-marketplace/redhat-operators-f5qpn" Jan 23 10:54:07 crc kubenswrapper[4957]: I0123 10:54:07.572119 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0641b36-685a-4625-93cb-a6159de3628e-catalog-content\") pod \"redhat-operators-f5qpn\" (UID: \"c0641b36-685a-4625-93cb-a6159de3628e\") " pod="openshift-marketplace/redhat-operators-f5qpn" Jan 23 10:54:07 crc kubenswrapper[4957]: I0123 10:54:07.572677 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0641b36-685a-4625-93cb-a6159de3628e-utilities\") pod \"redhat-operators-f5qpn\" (UID: \"c0641b36-685a-4625-93cb-a6159de3628e\") " pod="openshift-marketplace/redhat-operators-f5qpn" Jan 23 10:54:07 crc kubenswrapper[4957]: I0123 10:54:07.579509 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0641b36-685a-4625-93cb-a6159de3628e-utilities\") pod \"redhat-operators-f5qpn\" (UID: \"c0641b36-685a-4625-93cb-a6159de3628e\") " pod="openshift-marketplace/redhat-operators-f5qpn" Jan 23 10:54:07 crc kubenswrapper[4957]: I0123 10:54:07.611026 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vx99p\" (UniqueName: \"kubernetes.io/projected/c0641b36-685a-4625-93cb-a6159de3628e-kube-api-access-vx99p\") pod \"redhat-operators-f5qpn\" (UID: \"c0641b36-685a-4625-93cb-a6159de3628e\") " pod="openshift-marketplace/redhat-operators-f5qpn" Jan 23 10:54:07 crc kubenswrapper[4957]: I0123 10:54:07.684784 4957 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-f5qpn" Jan 23 10:54:07 crc kubenswrapper[4957]: I0123 10:54:07.724475 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-l54d2"] Jan 23 10:54:07 crc kubenswrapper[4957]: W0123 10:54:07.730573 4957 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode16b20f4_5d6a_4cf6_878a_3cf03afe72bb.slice/crio-8d56279619a1a1833f9d2e9ae863fc325299b3e581d8017e3e0b67b97c6b89e5 WatchSource:0}: Error finding container 8d56279619a1a1833f9d2e9ae863fc325299b3e581d8017e3e0b67b97c6b89e5: Status 404 returned error can't find the container with id 8d56279619a1a1833f9d2e9ae863fc325299b3e581d8017e3e0b67b97c6b89e5 Jan 23 10:54:07 crc kubenswrapper[4957]: I0123 10:54:07.768631 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-6brn9" Jan 23 10:54:07 crc kubenswrapper[4957]: I0123 10:54:07.770681 4957 patch_prober.go:28] interesting pod/router-default-5444994796-6brn9 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 23 10:54:07 crc kubenswrapper[4957]: [-]has-synced failed: reason withheld Jan 23 10:54:07 crc kubenswrapper[4957]: [+]process-running ok Jan 23 10:54:07 crc kubenswrapper[4957]: healthz check failed Jan 23 10:54:07 crc kubenswrapper[4957]: I0123 10:54:07.770731 4957 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6brn9" podUID="beac68ed-ac4c-424e-aaff-79a4f7f246d0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 23 10:54:08 crc kubenswrapper[4957]: I0123 10:54:08.123096 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-7cfk5" Jan 23 10:54:08 crc kubenswrapper[4957]: I0123 10:54:08.163047 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-f5qpn"] Jan 23 10:54:08 crc kubenswrapper[4957]: W0123 10:54:08.172872 4957 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc0641b36_685a_4625_93cb_a6159de3628e.slice/crio-70c23d98a5a2dc19892529565de66f4100c86ed8948466d5606dff009e1f418a WatchSource:0}: Error finding container 70c23d98a5a2dc19892529565de66f4100c86ed8948466d5606dff009e1f418a: Status 404 returned error can't find the container with id 70c23d98a5a2dc19892529565de66f4100c86ed8948466d5606dff009e1f418a Jan 23 10:54:08 crc kubenswrapper[4957]: I0123 10:54:08.483684 4957 generic.go:334] "Generic (PLEG): container finished" podID="e16b20f4-5d6a-4cf6-878a-3cf03afe72bb" containerID="db48b690b67c2ff8c0f31dbc1ad8da68672e03ca59cb0710c0be39be46ae5916" exitCode=0 Jan 23 10:54:08 crc kubenswrapper[4957]: I0123 10:54:08.483745 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l54d2" event={"ID":"e16b20f4-5d6a-4cf6-878a-3cf03afe72bb","Type":"ContainerDied","Data":"db48b690b67c2ff8c0f31dbc1ad8da68672e03ca59cb0710c0be39be46ae5916"} Jan 23 10:54:08 crc kubenswrapper[4957]: I0123 10:54:08.484015 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l54d2" 
event={"ID":"e16b20f4-5d6a-4cf6-878a-3cf03afe72bb","Type":"ContainerStarted","Data":"8d56279619a1a1833f9d2e9ae863fc325299b3e581d8017e3e0b67b97c6b89e5"} Jan 23 10:54:08 crc kubenswrapper[4957]: I0123 10:54:08.490410 4957 generic.go:334] "Generic (PLEG): container finished" podID="89d5d26c-8d52-47a1-ad35-7b23eecf210e" containerID="0356a953de35192689a4efecf7dea62a864a37066621e83c75a039e1c9570160" exitCode=0 Jan 23 10:54:08 crc kubenswrapper[4957]: I0123 10:54:08.490492 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"89d5d26c-8d52-47a1-ad35-7b23eecf210e","Type":"ContainerDied","Data":"0356a953de35192689a4efecf7dea62a864a37066621e83c75a039e1c9570160"} Jan 23 10:54:08 crc kubenswrapper[4957]: I0123 10:54:08.495215 4957 generic.go:334] "Generic (PLEG): container finished" podID="c0641b36-685a-4625-93cb-a6159de3628e" containerID="e314ec50c604d1ef5d26912b9a516d332c00f59307f5992f89272c4f522eb897" exitCode=0 Jan 23 10:54:08 crc kubenswrapper[4957]: I0123 10:54:08.497999 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f5qpn" event={"ID":"c0641b36-685a-4625-93cb-a6159de3628e","Type":"ContainerDied","Data":"e314ec50c604d1ef5d26912b9a516d332c00f59307f5992f89272c4f522eb897"} Jan 23 10:54:08 crc kubenswrapper[4957]: I0123 10:54:08.498031 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f5qpn" event={"ID":"c0641b36-685a-4625-93cb-a6159de3628e","Type":"ContainerStarted","Data":"70c23d98a5a2dc19892529565de66f4100c86ed8948466d5606dff009e1f418a"} Jan 23 10:54:08 crc kubenswrapper[4957]: I0123 10:54:08.773025 4957 patch_prober.go:28] interesting pod/router-default-5444994796-6brn9 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 23 10:54:08 crc kubenswrapper[4957]: [-]has-synced failed: reason withheld Jan 23 10:54:08 crc kubenswrapper[4957]: [+]process-running ok Jan 23 10:54:08 crc kubenswrapper[4957]: healthz check failed Jan 23 10:54:08 crc kubenswrapper[4957]: I0123 10:54:08.773130 4957 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6brn9" podUID="beac68ed-ac4c-424e-aaff-79a4f7f246d0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 23 10:54:09 crc kubenswrapper[4957]: I0123 10:54:09.571475 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" Jan 23 10:54:09 crc kubenswrapper[4957]: I0123 10:54:09.774557 4957 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-6brn9" Jan 23 10:54:09 crc kubenswrapper[4957]: I0123 10:54:09.780844 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-6brn9" Jan 23 10:54:09 crc kubenswrapper[4957]: I0123 10:54:09.892946 4957 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 23 10:54:10 crc kubenswrapper[4957]: I0123 10:54:10.013388 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/89d5d26c-8d52-47a1-ad35-7b23eecf210e-kube-api-access\") pod \"89d5d26c-8d52-47a1-ad35-7b23eecf210e\" (UID: \"89d5d26c-8d52-47a1-ad35-7b23eecf210e\") " Jan 23 10:54:10 crc kubenswrapper[4957]: I0123 10:54:10.013493 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/89d5d26c-8d52-47a1-ad35-7b23eecf210e-kubelet-dir\") pod \"89d5d26c-8d52-47a1-ad35-7b23eecf210e\" (UID: \"89d5d26c-8d52-47a1-ad35-7b23eecf210e\") " Jan 23 10:54:10 crc kubenswrapper[4957]: I0123 10:54:10.013853 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/89d5d26c-8d52-47a1-ad35-7b23eecf210e-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "89d5d26c-8d52-47a1-ad35-7b23eecf210e" (UID: "89d5d26c-8d52-47a1-ad35-7b23eecf210e"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 23 10:54:10 crc kubenswrapper[4957]: I0123 10:54:10.019133 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89d5d26c-8d52-47a1-ad35-7b23eecf210e-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "89d5d26c-8d52-47a1-ad35-7b23eecf210e" (UID: "89d5d26c-8d52-47a1-ad35-7b23eecf210e"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 23 10:54:10 crc kubenswrapper[4957]: I0123 10:54:10.115208 4957 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/89d5d26c-8d52-47a1-ad35-7b23eecf210e-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 23 10:54:10 crc kubenswrapper[4957]: I0123 10:54:10.115248 4957 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/89d5d26c-8d52-47a1-ad35-7b23eecf210e-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 23 10:54:10 crc kubenswrapper[4957]: I0123 10:54:10.548494 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"89d5d26c-8d52-47a1-ad35-7b23eecf210e","Type":"ContainerDied","Data":"1dd65bd4e9aa3eaf7de7c9c5433e5d2a371fbc65791fd1da5977506ee3e0c5c6"} Jan 23 10:54:10 crc kubenswrapper[4957]: I0123 10:54:10.548543 4957 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1dd65bd4e9aa3eaf7de7c9c5433e5d2a371fbc65791fd1da5977506ee3e0c5c6" Jan 23 10:54:10 crc kubenswrapper[4957]: I0123 10:54:10.548942 4957 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 23 10:54:11 crc kubenswrapper[4957]: I0123 10:54:11.138182 4957 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 23 10:54:11 crc kubenswrapper[4957]: E0123 10:54:11.138506 4957 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89d5d26c-8d52-47a1-ad35-7b23eecf210e" containerName="pruner" Jan 23 10:54:11 crc kubenswrapper[4957]: I0123 10:54:11.138564 4957 state_mem.go:107] "Deleted CPUSet assignment" podUID="89d5d26c-8d52-47a1-ad35-7b23eecf210e" containerName="pruner" Jan 23 10:54:11 crc kubenswrapper[4957]: I0123 10:54:11.138753 4957 memory_manager.go:354] "RemoveStaleState removing state" podUID="89d5d26c-8d52-47a1-ad35-7b23eecf210e" containerName="pruner" Jan 23 10:54:11 crc kubenswrapper[4957]: I0123 10:54:11.141910 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 23 10:54:11 crc kubenswrapper[4957]: I0123 10:54:11.146432 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Jan 23 10:54:11 crc kubenswrapper[4957]: I0123 10:54:11.146739 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Jan 23 10:54:11 crc kubenswrapper[4957]: I0123 10:54:11.161726 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 23 10:54:11 crc kubenswrapper[4957]: I0123 10:54:11.240151 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b133eb5c-19a7-4597-a201-65d2268a69ae-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"b133eb5c-19a7-4597-a201-65d2268a69ae\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 23 10:54:11 crc kubenswrapper[4957]: I0123 10:54:11.240220 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b133eb5c-19a7-4597-a201-65d2268a69ae-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"b133eb5c-19a7-4597-a201-65d2268a69ae\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 23 10:54:11 crc kubenswrapper[4957]: I0123 10:54:11.341369 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b133eb5c-19a7-4597-a201-65d2268a69ae-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"b133eb5c-19a7-4597-a201-65d2268a69ae\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 23 10:54:11 crc kubenswrapper[4957]: I0123 10:54:11.341474 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b133eb5c-19a7-4597-a201-65d2268a69ae-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"b133eb5c-19a7-4597-a201-65d2268a69ae\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 23 10:54:11 crc kubenswrapper[4957]: I0123 10:54:11.341511 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b133eb5c-19a7-4597-a201-65d2268a69ae-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"b133eb5c-19a7-4597-a201-65d2268a69ae\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 23 10:54:11 crc 
kubenswrapper[4957]: I0123 10:54:11.363537 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b133eb5c-19a7-4597-a201-65d2268a69ae-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"b133eb5c-19a7-4597-a201-65d2268a69ae\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 23 10:54:11 crc kubenswrapper[4957]: I0123 10:54:11.474356 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 23 10:54:11 crc kubenswrapper[4957]: I0123 10:54:11.692941 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 23 10:54:11 crc kubenswrapper[4957]: W0123 10:54:11.752065 4957 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podb133eb5c_19a7_4597_a201_65d2268a69ae.slice/crio-0afad2a05a1339ae69e021650e97667c2049d3e1987db1a271948941b6f95a04 WatchSource:0}: Error finding container 0afad2a05a1339ae69e021650e97667c2049d3e1987db1a271948941b6f95a04: Status 404 returned error can't find the container with id 0afad2a05a1339ae69e021650e97667c2049d3e1987db1a271948941b6f95a04 Jan 23 10:54:12 crc kubenswrapper[4957]: I0123 10:54:12.559511 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/87775b38-0664-48f6-8857-7568c135bd79-metrics-certs\") pod \"network-metrics-daemon-5fxfb\" (UID: \"87775b38-0664-48f6-8857-7568c135bd79\") " pod="openshift-multus/network-metrics-daemon-5fxfb" Jan 23 10:54:12 crc kubenswrapper[4957]: I0123 10:54:12.572986 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/87775b38-0664-48f6-8857-7568c135bd79-metrics-certs\") pod \"network-metrics-daemon-5fxfb\" (UID: \"87775b38-0664-48f6-8857-7568c135bd79\") " pod="openshift-multus/network-metrics-daemon-5fxfb" Jan 23 10:54:12 crc kubenswrapper[4957]: I0123 10:54:12.581499 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"b133eb5c-19a7-4597-a201-65d2268a69ae","Type":"ContainerStarted","Data":"0afad2a05a1339ae69e021650e97667c2049d3e1987db1a271948941b6f95a04"} Jan 23 10:54:12 crc kubenswrapper[4957]: I0123 10:54:12.801415 4957 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-5fxfb" Jan 23 10:54:13 crc kubenswrapper[4957]: I0123 10:54:13.311791 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-5fxfb"] Jan 23 10:54:13 crc kubenswrapper[4957]: I0123 10:54:13.314780 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-rxncn" Jan 23 10:54:13 crc kubenswrapper[4957]: W0123 10:54:13.342433 4957 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod87775b38_0664_48f6_8857_7568c135bd79.slice/crio-59d6a39ae8ceeabe12aabf69342b019edbb8ed3c84fa6b4174435bf62a1a61d2 WatchSource:0}: Error finding container 59d6a39ae8ceeabe12aabf69342b019edbb8ed3c84fa6b4174435bf62a1a61d2: Status 404 returned error can't find the container with id 59d6a39ae8ceeabe12aabf69342b019edbb8ed3c84fa6b4174435bf62a1a61d2 Jan 23 10:54:13 crc kubenswrapper[4957]: I0123 10:54:13.601483 4957 generic.go:334] "Generic (PLEG): container finished" podID="b133eb5c-19a7-4597-a201-65d2268a69ae" containerID="71203ca6b7d0145acb43698200754665c1b36e94e244e48e825a84c0bd0bea01" exitCode=0 Jan 23 10:54:13 crc kubenswrapper[4957]: I0123 10:54:13.601582 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"b133eb5c-19a7-4597-a201-65d2268a69ae","Type":"ContainerDied","Data":"71203ca6b7d0145acb43698200754665c1b36e94e244e48e825a84c0bd0bea01"} Jan 23 10:54:13 crc kubenswrapper[4957]: I0123 10:54:13.603628 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-5fxfb" event={"ID":"87775b38-0664-48f6-8857-7568c135bd79","Type":"ContainerStarted","Data":"59d6a39ae8ceeabe12aabf69342b019edbb8ed3c84fa6b4174435bf62a1a61d2"} Jan 23 10:54:14 crc kubenswrapper[4957]: I0123 10:54:14.613556 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-5fxfb" event={"ID":"87775b38-0664-48f6-8857-7568c135bd79","Type":"ContainerStarted","Data":"c623988e4b416828bdc7868f9088aa5bd2ff447d467e93bfd41259666daba66e"} Jan 23 10:54:14 crc kubenswrapper[4957]: I0123 10:54:14.932559 4957 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 23 10:54:15 crc kubenswrapper[4957]: I0123 10:54:15.104930 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b133eb5c-19a7-4597-a201-65d2268a69ae-kube-api-access\") pod \"b133eb5c-19a7-4597-a201-65d2268a69ae\" (UID: \"b133eb5c-19a7-4597-a201-65d2268a69ae\") " Jan 23 10:54:15 crc kubenswrapper[4957]: I0123 10:54:15.105011 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b133eb5c-19a7-4597-a201-65d2268a69ae-kubelet-dir\") pod \"b133eb5c-19a7-4597-a201-65d2268a69ae\" (UID: \"b133eb5c-19a7-4597-a201-65d2268a69ae\") " Jan 23 10:54:15 crc kubenswrapper[4957]: I0123 10:54:15.105241 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b133eb5c-19a7-4597-a201-65d2268a69ae-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "b133eb5c-19a7-4597-a201-65d2268a69ae" (UID: "b133eb5c-19a7-4597-a201-65d2268a69ae"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 23 10:54:15 crc kubenswrapper[4957]: I0123 10:54:15.123201 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b133eb5c-19a7-4597-a201-65d2268a69ae-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "b133eb5c-19a7-4597-a201-65d2268a69ae" (UID: "b133eb5c-19a7-4597-a201-65d2268a69ae"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 23 10:54:15 crc kubenswrapper[4957]: I0123 10:54:15.206323 4957 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b133eb5c-19a7-4597-a201-65d2268a69ae-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 23 10:54:15 crc kubenswrapper[4957]: I0123 10:54:15.206365 4957 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b133eb5c-19a7-4597-a201-65d2268a69ae-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 23 10:54:15 crc kubenswrapper[4957]: I0123 10:54:15.621172 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-5fxfb" event={"ID":"87775b38-0664-48f6-8857-7568c135bd79","Type":"ContainerStarted","Data":"b665ee9c81546dc74bb22c6a874288e9e174fbf27e65b45e5c9100d9e89bd079"} Jan 23 10:54:15 crc kubenswrapper[4957]: I0123 10:54:15.625058 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"b133eb5c-19a7-4597-a201-65d2268a69ae","Type":"ContainerDied","Data":"0afad2a05a1339ae69e021650e97667c2049d3e1987db1a271948941b6f95a04"} Jan 23 10:54:15 crc kubenswrapper[4957]: I0123 10:54:15.625091 4957 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0afad2a05a1339ae69e021650e97667c2049d3e1987db1a271948941b6f95a04" Jan 23 10:54:15 crc kubenswrapper[4957]: I0123 10:54:15.625133 4957 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 23 10:54:15 crc kubenswrapper[4957]: I0123 10:54:15.639080 4957 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-5fxfb" podStartSLOduration=145.639067317 podStartE2EDuration="2m25.639067317s" podCreationTimestamp="2026-01-23 10:51:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-23 10:54:15.638700807 +0000 UTC m=+165.175953494" watchObservedRunningTime="2026-01-23 10:54:15.639067317 +0000 UTC m=+165.176320004" Jan 23 10:54:15 crc kubenswrapper[4957]: I0123 10:54:15.716871 4957 patch_prober.go:28] interesting pod/machine-config-daemon-w2xjv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 23 10:54:15 crc kubenswrapper[4957]: I0123 10:54:15.716922 4957 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w2xjv" podUID="224e3211-1f68-4673-8975-7e71b1e513d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 23 10:54:17 crc kubenswrapper[4957]: I0123 10:54:17.091841 4957 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-frdsx" Jan 23 10:54:17 crc kubenswrapper[4957]: I0123 10:54:17.095922 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-frdsx" Jan 23 10:54:19 crc kubenswrapper[4957]: I0123 10:54:19.557594 4957 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-cqcmp"] Jan 23 10:54:19 crc kubenswrapper[4957]: I0123 10:54:19.558158 4957 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-cqcmp" podUID="ea6adf64-ce45-4965-8c43-0f0215c18109" containerName="controller-manager" containerID="cri-o://47d214baa4f3aea0bf8b97cf2de29bf328345dc34ec9b06c97867d6d4108ea34" gracePeriod=30 Jan 23 10:54:19 crc kubenswrapper[4957]: I0123 10:54:19.579135 4957 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-z7qr7"] Jan 23 10:54:19 crc kubenswrapper[4957]: I0123 10:54:19.579343 4957 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-z7qr7" podUID="f6bd9bee-ee36-4280-948c-f39f88acaa70" containerName="route-controller-manager" containerID="cri-o://372478e4e90cdb2660fa7b912f5f3461e60b2ae637861911896bf3ef647f70e7" gracePeriod=30 Jan 23 10:54:21 crc kubenswrapper[4957]: I0123 10:54:21.661812 4957 generic.go:334] "Generic (PLEG): container finished" podID="f6bd9bee-ee36-4280-948c-f39f88acaa70" containerID="372478e4e90cdb2660fa7b912f5f3461e60b2ae637861911896bf3ef647f70e7" exitCode=0 Jan 23 10:54:21 crc kubenswrapper[4957]: I0123 10:54:21.661936 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-z7qr7" 
event={"ID":"f6bd9bee-ee36-4280-948c-f39f88acaa70","Type":"ContainerDied","Data":"372478e4e90cdb2660fa7b912f5f3461e60b2ae637861911896bf3ef647f70e7"} Jan 23 10:54:21 crc kubenswrapper[4957]: I0123 10:54:21.665517 4957 generic.go:334] "Generic (PLEG): container finished" podID="ea6adf64-ce45-4965-8c43-0f0215c18109" containerID="47d214baa4f3aea0bf8b97cf2de29bf328345dc34ec9b06c97867d6d4108ea34" exitCode=0 Jan 23 10:54:21 crc kubenswrapper[4957]: I0123 10:54:21.665561 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-cqcmp" event={"ID":"ea6adf64-ce45-4965-8c43-0f0215c18109","Type":"ContainerDied","Data":"47d214baa4f3aea0bf8b97cf2de29bf328345dc34ec9b06c97867d6d4108ea34"} Jan 23 10:54:25 crc kubenswrapper[4957]: I0123 10:54:25.281724 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-hgc64" Jan 23 10:54:28 crc kubenswrapper[4957]: I0123 10:54:28.279496 4957 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-z7qr7 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.16:8443/healthz\": dial tcp 10.217.0.16:8443: i/o timeout" start-of-body= Jan 23 10:54:28 crc kubenswrapper[4957]: I0123 10:54:28.279864 4957 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-z7qr7" podUID="f6bd9bee-ee36-4280-948c-f39f88acaa70" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.16:8443/healthz\": dial tcp 10.217.0.16:8443: i/o timeout" Jan 23 10:54:28 crc kubenswrapper[4957]: I0123 10:54:28.282998 4957 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-cqcmp container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: i/o timeout" start-of-body= Jan 23 10:54:28 crc kubenswrapper[4957]: I0123 10:54:28.283042 4957 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-cqcmp" podUID="ea6adf64-ce45-4965-8c43-0f0215c18109" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: i/o timeout" Jan 23 10:54:31 crc kubenswrapper[4957]: I0123 10:54:31.025858 4957 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-cqcmp" Jan 23 10:54:31 crc kubenswrapper[4957]: I0123 10:54:31.033466 4957 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-z7qr7" Jan 23 10:54:31 crc kubenswrapper[4957]: I0123 10:54:31.053266 4957 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-856dc49d-tv2rw"] Jan 23 10:54:31 crc kubenswrapper[4957]: E0123 10:54:31.053486 4957 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b133eb5c-19a7-4597-a201-65d2268a69ae" containerName="pruner" Jan 23 10:54:31 crc kubenswrapper[4957]: I0123 10:54:31.053497 4957 state_mem.go:107] "Deleted CPUSet assignment" podUID="b133eb5c-19a7-4597-a201-65d2268a69ae" containerName="pruner" Jan 23 10:54:31 crc kubenswrapper[4957]: E0123 10:54:31.053507 4957 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6bd9bee-ee36-4280-948c-f39f88acaa70" containerName="route-controller-manager" Jan 23 10:54:31 crc kubenswrapper[4957]: I0123 10:54:31.053514 4957 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6bd9bee-ee36-4280-948c-f39f88acaa70" containerName="route-controller-manager" Jan 23 10:54:31 crc kubenswrapper[4957]: E0123 10:54:31.053526 4957 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea6adf64-ce45-4965-8c43-0f0215c18109" containerName="controller-manager" Jan 23 10:54:31 crc kubenswrapper[4957]: I0123 10:54:31.053533 4957 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea6adf64-ce45-4965-8c43-0f0215c18109" containerName="controller-manager" Jan 23 10:54:31 crc kubenswrapper[4957]: I0123 10:54:31.053617 4957 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea6adf64-ce45-4965-8c43-0f0215c18109" containerName="controller-manager" Jan 23 10:54:31 crc kubenswrapper[4957]: I0123 10:54:31.053629 4957 memory_manager.go:354] "RemoveStaleState removing state" podUID="b133eb5c-19a7-4597-a201-65d2268a69ae" containerName="pruner" Jan 23 10:54:31 crc kubenswrapper[4957]: I0123 10:54:31.053638 4957 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6bd9bee-ee36-4280-948c-f39f88acaa70" containerName="route-controller-manager" Jan 23 10:54:31 crc kubenswrapper[4957]: I0123 10:54:31.054140 4957 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-856dc49d-tv2rw" Jan 23 10:54:31 crc kubenswrapper[4957]: I0123 10:54:31.069377 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-856dc49d-tv2rw"] Jan 23 10:54:31 crc kubenswrapper[4957]: I0123 10:54:31.127900 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea6adf64-ce45-4965-8c43-0f0215c18109-config\") pod \"ea6adf64-ce45-4965-8c43-0f0215c18109\" (UID: \"ea6adf64-ce45-4965-8c43-0f0215c18109\") " Jan 23 10:54:31 crc kubenswrapper[4957]: I0123 10:54:31.127944 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ea6adf64-ce45-4965-8c43-0f0215c18109-client-ca\") pod \"ea6adf64-ce45-4965-8c43-0f0215c18109\" (UID: \"ea6adf64-ce45-4965-8c43-0f0215c18109\") " Jan 23 10:54:31 crc kubenswrapper[4957]: I0123 10:54:31.127970 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ea6adf64-ce45-4965-8c43-0f0215c18109-serving-cert\") pod \"ea6adf64-ce45-4965-8c43-0f0215c18109\" (UID: \"ea6adf64-ce45-4965-8c43-0f0215c18109\") " Jan 23 10:54:31 crc kubenswrapper[4957]: I0123 10:54:31.127987 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lk58c\" (UniqueName: \"kubernetes.io/projected/ea6adf64-ce45-4965-8c43-0f0215c18109-kube-api-access-lk58c\") pod \"ea6adf64-ce45-4965-8c43-0f0215c18109\" (UID: \"ea6adf64-ce45-4965-8c43-0f0215c18109\") " Jan 23 10:54:31 crc kubenswrapper[4957]: I0123 10:54:31.128064 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ea6adf64-ce45-4965-8c43-0f0215c18109-proxy-ca-bundles\") pod \"ea6adf64-ce45-4965-8c43-0f0215c18109\" (UID: \"ea6adf64-ce45-4965-8c43-0f0215c18109\") " Jan 23 10:54:31 crc kubenswrapper[4957]: I0123 10:54:31.128515 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f6bd9bee-ee36-4280-948c-f39f88acaa70-client-ca\") pod \"f6bd9bee-ee36-4280-948c-f39f88acaa70\" (UID: \"f6bd9bee-ee36-4280-948c-f39f88acaa70\") " Jan 23 10:54:31 crc kubenswrapper[4957]: I0123 10:54:31.128898 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea6adf64-ce45-4965-8c43-0f0215c18109-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "ea6adf64-ce45-4965-8c43-0f0215c18109" (UID: "ea6adf64-ce45-4965-8c43-0f0215c18109"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 10:54:31 crc kubenswrapper[4957]: I0123 10:54:31.129177 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea6adf64-ce45-4965-8c43-0f0215c18109-client-ca" (OuterVolumeSpecName: "client-ca") pod "ea6adf64-ce45-4965-8c43-0f0215c18109" (UID: "ea6adf64-ce45-4965-8c43-0f0215c18109"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 10:54:31 crc kubenswrapper[4957]: I0123 10:54:31.129562 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6bd9bee-ee36-4280-948c-f39f88acaa70-client-ca" (OuterVolumeSpecName: "client-ca") pod "f6bd9bee-ee36-4280-948c-f39f88acaa70" (UID: "f6bd9bee-ee36-4280-948c-f39f88acaa70"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 10:54:31 crc kubenswrapper[4957]: I0123 10:54:31.129713 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea6adf64-ce45-4965-8c43-0f0215c18109-config" (OuterVolumeSpecName: "config") pod "ea6adf64-ce45-4965-8c43-0f0215c18109" (UID: "ea6adf64-ce45-4965-8c43-0f0215c18109"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 10:54:31 crc kubenswrapper[4957]: I0123 10:54:31.129803 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b6a744ab-4b97-4bbc-a496-0a665b6eba74-client-ca\") pod \"controller-manager-856dc49d-tv2rw\" (UID: \"b6a744ab-4b97-4bbc-a496-0a665b6eba74\") " pod="openshift-controller-manager/controller-manager-856dc49d-tv2rw" Jan 23 10:54:31 crc kubenswrapper[4957]: I0123 10:54:31.129881 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6a744ab-4b97-4bbc-a496-0a665b6eba74-config\") pod \"controller-manager-856dc49d-tv2rw\" (UID: \"b6a744ab-4b97-4bbc-a496-0a665b6eba74\") " pod="openshift-controller-manager/controller-manager-856dc49d-tv2rw" Jan 23 10:54:31 crc kubenswrapper[4957]: I0123 10:54:31.129940 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b6a744ab-4b97-4bbc-a496-0a665b6eba74-proxy-ca-bundles\") pod \"controller-manager-856dc49d-tv2rw\" (UID: \"b6a744ab-4b97-4bbc-a496-0a665b6eba74\") " pod="openshift-controller-manager/controller-manager-856dc49d-tv2rw" Jan 23 10:54:31 crc kubenswrapper[4957]: I0123 10:54:31.130019 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b6a744ab-4b97-4bbc-a496-0a665b6eba74-serving-cert\") pod \"controller-manager-856dc49d-tv2rw\" (UID: \"b6a744ab-4b97-4bbc-a496-0a665b6eba74\") " pod="openshift-controller-manager/controller-manager-856dc49d-tv2rw" Jan 23 10:54:31 crc kubenswrapper[4957]: I0123 10:54:31.130099 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s57zb\" (UniqueName: \"kubernetes.io/projected/b6a744ab-4b97-4bbc-a496-0a665b6eba74-kube-api-access-s57zb\") pod \"controller-manager-856dc49d-tv2rw\" (UID: \"b6a744ab-4b97-4bbc-a496-0a665b6eba74\") " pod="openshift-controller-manager/controller-manager-856dc49d-tv2rw" Jan 23 10:54:31 crc kubenswrapper[4957]: I0123 10:54:31.130167 4957 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ea6adf64-ce45-4965-8c43-0f0215c18109-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 23 10:54:31 crc kubenswrapper[4957]: I0123 10:54:31.130184 4957 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/f6bd9bee-ee36-4280-948c-f39f88acaa70-client-ca\") on node \"crc\" DevicePath \"\"" Jan 23 10:54:31 crc kubenswrapper[4957]: I0123 10:54:31.130206 4957 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea6adf64-ce45-4965-8c43-0f0215c18109-config\") on node \"crc\" DevicePath \"\"" Jan 23 10:54:31 crc kubenswrapper[4957]: I0123 10:54:31.130219 4957 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ea6adf64-ce45-4965-8c43-0f0215c18109-client-ca\") on node \"crc\" DevicePath \"\"" Jan 23 10:54:31 crc kubenswrapper[4957]: I0123 10:54:31.134803 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea6adf64-ce45-4965-8c43-0f0215c18109-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ea6adf64-ce45-4965-8c43-0f0215c18109" (UID: "ea6adf64-ce45-4965-8c43-0f0215c18109"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 23 10:54:31 crc kubenswrapper[4957]: I0123 10:54:31.138102 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea6adf64-ce45-4965-8c43-0f0215c18109-kube-api-access-lk58c" (OuterVolumeSpecName: "kube-api-access-lk58c") pod "ea6adf64-ce45-4965-8c43-0f0215c18109" (UID: "ea6adf64-ce45-4965-8c43-0f0215c18109"). InnerVolumeSpecName "kube-api-access-lk58c". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 23 10:54:31 crc kubenswrapper[4957]: I0123 10:54:31.230908 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f6bd9bee-ee36-4280-948c-f39f88acaa70-serving-cert\") pod \"f6bd9bee-ee36-4280-948c-f39f88acaa70\" (UID: \"f6bd9bee-ee36-4280-948c-f39f88acaa70\") " Jan 23 10:54:31 crc kubenswrapper[4957]: I0123 10:54:31.230980 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6bd9bee-ee36-4280-948c-f39f88acaa70-config\") pod \"f6bd9bee-ee36-4280-948c-f39f88acaa70\" (UID: \"f6bd9bee-ee36-4280-948c-f39f88acaa70\") " Jan 23 10:54:31 crc kubenswrapper[4957]: I0123 10:54:31.231039 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fpvql\" (UniqueName: \"kubernetes.io/projected/f6bd9bee-ee36-4280-948c-f39f88acaa70-kube-api-access-fpvql\") pod \"f6bd9bee-ee36-4280-948c-f39f88acaa70\" (UID: \"f6bd9bee-ee36-4280-948c-f39f88acaa70\") " Jan 23 10:54:31 crc kubenswrapper[4957]: I0123 10:54:31.231142 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b6a744ab-4b97-4bbc-a496-0a665b6eba74-serving-cert\") pod \"controller-manager-856dc49d-tv2rw\" (UID: \"b6a744ab-4b97-4bbc-a496-0a665b6eba74\") " pod="openshift-controller-manager/controller-manager-856dc49d-tv2rw" Jan 23 10:54:31 crc kubenswrapper[4957]: I0123 10:54:31.231179 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s57zb\" (UniqueName: \"kubernetes.io/projected/b6a744ab-4b97-4bbc-a496-0a665b6eba74-kube-api-access-s57zb\") pod \"controller-manager-856dc49d-tv2rw\" (UID: \"b6a744ab-4b97-4bbc-a496-0a665b6eba74\") " pod="openshift-controller-manager/controller-manager-856dc49d-tv2rw" Jan 23 10:54:31 crc kubenswrapper[4957]: I0123 10:54:31.231214 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"client-ca\" (UniqueName: \"kubernetes.io/configmap/b6a744ab-4b97-4bbc-a496-0a665b6eba74-client-ca\") pod \"controller-manager-856dc49d-tv2rw\" (UID: \"b6a744ab-4b97-4bbc-a496-0a665b6eba74\") " pod="openshift-controller-manager/controller-manager-856dc49d-tv2rw" Jan 23 10:54:31 crc kubenswrapper[4957]: I0123 10:54:31.231237 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6a744ab-4b97-4bbc-a496-0a665b6eba74-config\") pod \"controller-manager-856dc49d-tv2rw\" (UID: \"b6a744ab-4b97-4bbc-a496-0a665b6eba74\") " pod="openshift-controller-manager/controller-manager-856dc49d-tv2rw" Jan 23 10:54:31 crc kubenswrapper[4957]: I0123 10:54:31.231254 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b6a744ab-4b97-4bbc-a496-0a665b6eba74-proxy-ca-bundles\") pod \"controller-manager-856dc49d-tv2rw\" (UID: \"b6a744ab-4b97-4bbc-a496-0a665b6eba74\") " pod="openshift-controller-manager/controller-manager-856dc49d-tv2rw" Jan 23 10:54:31 crc kubenswrapper[4957]: I0123 10:54:31.231314 4957 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ea6adf64-ce45-4965-8c43-0f0215c18109-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 23 10:54:31 crc kubenswrapper[4957]: I0123 10:54:31.231324 4957 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lk58c\" (UniqueName: \"kubernetes.io/projected/ea6adf64-ce45-4965-8c43-0f0215c18109-kube-api-access-lk58c\") on node \"crc\" DevicePath \"\"" Jan 23 10:54:31 crc kubenswrapper[4957]: I0123 10:54:31.231629 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6bd9bee-ee36-4280-948c-f39f88acaa70-config" (OuterVolumeSpecName: "config") pod "f6bd9bee-ee36-4280-948c-f39f88acaa70" (UID: "f6bd9bee-ee36-4280-948c-f39f88acaa70"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 10:54:31 crc kubenswrapper[4957]: I0123 10:54:31.232594 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b6a744ab-4b97-4bbc-a496-0a665b6eba74-proxy-ca-bundles\") pod \"controller-manager-856dc49d-tv2rw\" (UID: \"b6a744ab-4b97-4bbc-a496-0a665b6eba74\") " pod="openshift-controller-manager/controller-manager-856dc49d-tv2rw" Jan 23 10:54:31 crc kubenswrapper[4957]: I0123 10:54:31.233069 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6a744ab-4b97-4bbc-a496-0a665b6eba74-config\") pod \"controller-manager-856dc49d-tv2rw\" (UID: \"b6a744ab-4b97-4bbc-a496-0a665b6eba74\") " pod="openshift-controller-manager/controller-manager-856dc49d-tv2rw" Jan 23 10:54:31 crc kubenswrapper[4957]: I0123 10:54:31.235392 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b6a744ab-4b97-4bbc-a496-0a665b6eba74-serving-cert\") pod \"controller-manager-856dc49d-tv2rw\" (UID: \"b6a744ab-4b97-4bbc-a496-0a665b6eba74\") " pod="openshift-controller-manager/controller-manager-856dc49d-tv2rw" Jan 23 10:54:31 crc kubenswrapper[4957]: I0123 10:54:31.235498 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6bd9bee-ee36-4280-948c-f39f88acaa70-kube-api-access-fpvql" (OuterVolumeSpecName: "kube-api-access-fpvql") pod "f6bd9bee-ee36-4280-948c-f39f88acaa70" (UID: "f6bd9bee-ee36-4280-948c-f39f88acaa70"). InnerVolumeSpecName "kube-api-access-fpvql". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 23 10:54:31 crc kubenswrapper[4957]: I0123 10:54:31.236181 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6bd9bee-ee36-4280-948c-f39f88acaa70-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "f6bd9bee-ee36-4280-948c-f39f88acaa70" (UID: "f6bd9bee-ee36-4280-948c-f39f88acaa70"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 23 10:54:31 crc kubenswrapper[4957]: I0123 10:54:31.248455 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s57zb\" (UniqueName: \"kubernetes.io/projected/b6a744ab-4b97-4bbc-a496-0a665b6eba74-kube-api-access-s57zb\") pod \"controller-manager-856dc49d-tv2rw\" (UID: \"b6a744ab-4b97-4bbc-a496-0a665b6eba74\") " pod="openshift-controller-manager/controller-manager-856dc49d-tv2rw" Jan 23 10:54:31 crc kubenswrapper[4957]: I0123 10:54:31.310788 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-cqcmp" event={"ID":"ea6adf64-ce45-4965-8c43-0f0215c18109","Type":"ContainerDied","Data":"40448eee897d34ffcafc921680edc9433eade05f4f35601a9217be7e41f2d194"} Jan 23 10:54:31 crc kubenswrapper[4957]: I0123 10:54:31.310818 4957 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-cqcmp" Jan 23 10:54:31 crc kubenswrapper[4957]: I0123 10:54:31.310872 4957 scope.go:117] "RemoveContainer" containerID="47d214baa4f3aea0bf8b97cf2de29bf328345dc34ec9b06c97867d6d4108ea34" Jan 23 10:54:31 crc kubenswrapper[4957]: I0123 10:54:31.315160 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-z7qr7" event={"ID":"f6bd9bee-ee36-4280-948c-f39f88acaa70","Type":"ContainerDied","Data":"2e0f477e169be6359210eeffdf587282cce727f254e63542448a9708973adcee"} Jan 23 10:54:31 crc kubenswrapper[4957]: I0123 10:54:31.315251 4957 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-z7qr7" Jan 23 10:54:31 crc kubenswrapper[4957]: I0123 10:54:31.323983 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b6a744ab-4b97-4bbc-a496-0a665b6eba74-client-ca\") pod \"controller-manager-856dc49d-tv2rw\" (UID: \"b6a744ab-4b97-4bbc-a496-0a665b6eba74\") " pod="openshift-controller-manager/controller-manager-856dc49d-tv2rw" Jan 23 10:54:31 crc kubenswrapper[4957]: I0123 10:54:31.335115 4957 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f6bd9bee-ee36-4280-948c-f39f88acaa70-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 23 10:54:31 crc kubenswrapper[4957]: I0123 10:54:31.335180 4957 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6bd9bee-ee36-4280-948c-f39f88acaa70-config\") on node \"crc\" DevicePath \"\"" Jan 23 10:54:31 crc kubenswrapper[4957]: I0123 10:54:31.335194 4957 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fpvql\" (UniqueName: \"kubernetes.io/projected/f6bd9bee-ee36-4280-948c-f39f88acaa70-kube-api-access-fpvql\") on node \"crc\" DevicePath \"\"" Jan 23 10:54:31 crc kubenswrapper[4957]: I0123 10:54:31.341306 4957 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-cqcmp"] Jan 23 10:54:31 crc kubenswrapper[4957]: I0123 10:54:31.345552 4957 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-cqcmp"] Jan 23 10:54:31 crc kubenswrapper[4957]: I0123 10:54:31.353372 4957 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-z7qr7"] Jan 23 10:54:31 crc kubenswrapper[4957]: I0123 10:54:31.356410 4957 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-z7qr7"] Jan 23 10:54:31 crc kubenswrapper[4957]: I0123 10:54:31.421548 4957 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-856dc49d-tv2rw" Jan 23 10:54:32 crc kubenswrapper[4957]: I0123 10:54:32.788618 4957 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea6adf64-ce45-4965-8c43-0f0215c18109" path="/var/lib/kubelet/pods/ea6adf64-ce45-4965-8c43-0f0215c18109/volumes" Jan 23 10:54:32 crc kubenswrapper[4957]: I0123 10:54:32.789911 4957 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6bd9bee-ee36-4280-948c-f39f88acaa70" path="/var/lib/kubelet/pods/f6bd9bee-ee36-4280-948c-f39f88acaa70/volumes" Jan 23 10:54:33 crc kubenswrapper[4957]: I0123 10:54:33.547786 4957 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6b469fb7f8-wd5ct"] Jan 23 10:54:33 crc kubenswrapper[4957]: I0123 10:54:33.548643 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6b469fb7f8-wd5ct" Jan 23 10:54:33 crc kubenswrapper[4957]: I0123 10:54:33.551876 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 23 10:54:33 crc kubenswrapper[4957]: I0123 10:54:33.552660 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 23 10:54:33 crc kubenswrapper[4957]: I0123 10:54:33.552719 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 23 10:54:33 crc kubenswrapper[4957]: I0123 10:54:33.552727 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 23 10:54:33 crc kubenswrapper[4957]: I0123 10:54:33.552752 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 23 10:54:33 crc kubenswrapper[4957]: I0123 10:54:33.553393 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 23 10:54:33 crc kubenswrapper[4957]: I0123 10:54:33.561868 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ql9fs\" (UniqueName: \"kubernetes.io/projected/77c5575a-d73b-43a9-896d-58bf5c1ebe2b-kube-api-access-ql9fs\") pod \"route-controller-manager-6b469fb7f8-wd5ct\" (UID: \"77c5575a-d73b-43a9-896d-58bf5c1ebe2b\") " pod="openshift-route-controller-manager/route-controller-manager-6b469fb7f8-wd5ct" Jan 23 10:54:33 crc kubenswrapper[4957]: I0123 10:54:33.561932 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/77c5575a-d73b-43a9-896d-58bf5c1ebe2b-client-ca\") pod \"route-controller-manager-6b469fb7f8-wd5ct\" (UID: \"77c5575a-d73b-43a9-896d-58bf5c1ebe2b\") " pod="openshift-route-controller-manager/route-controller-manager-6b469fb7f8-wd5ct" Jan 23 10:54:33 crc kubenswrapper[4957]: I0123 10:54:33.561959 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77c5575a-d73b-43a9-896d-58bf5c1ebe2b-config\") pod \"route-controller-manager-6b469fb7f8-wd5ct\" (UID: \"77c5575a-d73b-43a9-896d-58bf5c1ebe2b\") " pod="openshift-route-controller-manager/route-controller-manager-6b469fb7f8-wd5ct" Jan 23 10:54:33 crc 
kubenswrapper[4957]: I0123 10:54:33.562003 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/77c5575a-d73b-43a9-896d-58bf5c1ebe2b-serving-cert\") pod \"route-controller-manager-6b469fb7f8-wd5ct\" (UID: \"77c5575a-d73b-43a9-896d-58bf5c1ebe2b\") " pod="openshift-route-controller-manager/route-controller-manager-6b469fb7f8-wd5ct" Jan 23 10:54:33 crc kubenswrapper[4957]: I0123 10:54:33.562720 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6b469fb7f8-wd5ct"] Jan 23 10:54:33 crc kubenswrapper[4957]: I0123 10:54:33.663104 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ql9fs\" (UniqueName: \"kubernetes.io/projected/77c5575a-d73b-43a9-896d-58bf5c1ebe2b-kube-api-access-ql9fs\") pod \"route-controller-manager-6b469fb7f8-wd5ct\" (UID: \"77c5575a-d73b-43a9-896d-58bf5c1ebe2b\") " pod="openshift-route-controller-manager/route-controller-manager-6b469fb7f8-wd5ct" Jan 23 10:54:33 crc kubenswrapper[4957]: I0123 10:54:33.663212 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/77c5575a-d73b-43a9-896d-58bf5c1ebe2b-client-ca\") pod \"route-controller-manager-6b469fb7f8-wd5ct\" (UID: \"77c5575a-d73b-43a9-896d-58bf5c1ebe2b\") " pod="openshift-route-controller-manager/route-controller-manager-6b469fb7f8-wd5ct" Jan 23 10:54:33 crc kubenswrapper[4957]: I0123 10:54:33.663243 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77c5575a-d73b-43a9-896d-58bf5c1ebe2b-config\") pod \"route-controller-manager-6b469fb7f8-wd5ct\" (UID: \"77c5575a-d73b-43a9-896d-58bf5c1ebe2b\") " pod="openshift-route-controller-manager/route-controller-manager-6b469fb7f8-wd5ct" Jan 23 10:54:33 crc kubenswrapper[4957]: I0123 10:54:33.664518 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/77c5575a-d73b-43a9-896d-58bf5c1ebe2b-client-ca\") pod \"route-controller-manager-6b469fb7f8-wd5ct\" (UID: \"77c5575a-d73b-43a9-896d-58bf5c1ebe2b\") " pod="openshift-route-controller-manager/route-controller-manager-6b469fb7f8-wd5ct" Jan 23 10:54:33 crc kubenswrapper[4957]: I0123 10:54:33.664824 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77c5575a-d73b-43a9-896d-58bf5c1ebe2b-config\") pod \"route-controller-manager-6b469fb7f8-wd5ct\" (UID: \"77c5575a-d73b-43a9-896d-58bf5c1ebe2b\") " pod="openshift-route-controller-manager/route-controller-manager-6b469fb7f8-wd5ct" Jan 23 10:54:33 crc kubenswrapper[4957]: I0123 10:54:33.665126 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/77c5575a-d73b-43a9-896d-58bf5c1ebe2b-serving-cert\") pod \"route-controller-manager-6b469fb7f8-wd5ct\" (UID: \"77c5575a-d73b-43a9-896d-58bf5c1ebe2b\") " pod="openshift-route-controller-manager/route-controller-manager-6b469fb7f8-wd5ct" Jan 23 10:54:33 crc kubenswrapper[4957]: I0123 10:54:33.674454 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/77c5575a-d73b-43a9-896d-58bf5c1ebe2b-serving-cert\") pod \"route-controller-manager-6b469fb7f8-wd5ct\" (UID: \"77c5575a-d73b-43a9-896d-58bf5c1ebe2b\") " 
pod="openshift-route-controller-manager/route-controller-manager-6b469fb7f8-wd5ct" Jan 23 10:54:33 crc kubenswrapper[4957]: I0123 10:54:33.691508 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ql9fs\" (UniqueName: \"kubernetes.io/projected/77c5575a-d73b-43a9-896d-58bf5c1ebe2b-kube-api-access-ql9fs\") pod \"route-controller-manager-6b469fb7f8-wd5ct\" (UID: \"77c5575a-d73b-43a9-896d-58bf5c1ebe2b\") " pod="openshift-route-controller-manager/route-controller-manager-6b469fb7f8-wd5ct" Jan 23 10:54:33 crc kubenswrapper[4957]: I0123 10:54:33.872153 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6b469fb7f8-wd5ct" Jan 23 10:54:37 crc kubenswrapper[4957]: I0123 10:54:37.569761 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 23 10:54:37 crc kubenswrapper[4957]: I0123 10:54:37.885172 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jp7cg" Jan 23 10:54:39 crc kubenswrapper[4957]: I0123 10:54:39.462421 4957 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-856dc49d-tv2rw"] Jan 23 10:54:39 crc kubenswrapper[4957]: I0123 10:54:39.491384 4957 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6b469fb7f8-wd5ct"] Jan 23 10:54:40 crc kubenswrapper[4957]: E0123 10:54:40.778546 4957 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Jan 23 10:54:40 crc kubenswrapper[4957]: E0123 10:54:40.778738 4957 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hsspd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
certified-operators-tbq4h_openshift-marketplace(6e09bef4-a911-4771-b0e4-233eba62eddf): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 23 10:54:40 crc kubenswrapper[4957]: E0123 10:54:40.779944 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-tbq4h" podUID="6e09bef4-a911-4771-b0e4-233eba62eddf" Jan 23 10:54:45 crc kubenswrapper[4957]: I0123 10:54:45.717129 4957 patch_prober.go:28] interesting pod/machine-config-daemon-w2xjv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 23 10:54:45 crc kubenswrapper[4957]: I0123 10:54:45.717519 4957 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w2xjv" podUID="224e3211-1f68-4673-8975-7e71b1e513d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 23 10:54:45 crc kubenswrapper[4957]: E0123 10:54:45.855549 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-tbq4h" podUID="6e09bef4-a911-4771-b0e4-233eba62eddf" Jan 23 10:54:45 crc kubenswrapper[4957]: E0123 10:54:45.951542 4957 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Jan 23 10:54:45 crc kubenswrapper[4957]: E0123 10:54:45.951719 4957 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4hfkc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-5jhgc_openshift-marketplace(784bb395-54fe-47ab-9fd3-0298329d8566): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 23 10:54:45 crc kubenswrapper[4957]: E0123 10:54:45.952933 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-5jhgc" podUID="784bb395-54fe-47ab-9fd3-0298329d8566" Jan 23 10:54:46 crc kubenswrapper[4957]: I0123 10:54:46.338165 4957 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Jan 23 10:54:46 crc kubenswrapper[4957]: I0123 10:54:46.339124 4957 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 23 10:54:46 crc kubenswrapper[4957]: I0123 10:54:46.343199 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Jan 23 10:54:46 crc kubenswrapper[4957]: I0123 10:54:46.343433 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Jan 23 10:54:46 crc kubenswrapper[4957]: I0123 10:54:46.353478 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Jan 23 10:54:46 crc kubenswrapper[4957]: I0123 10:54:46.472578 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/de5b5c80-661f-4108-becb-a9e0598ff438-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"de5b5c80-661f-4108-becb-a9e0598ff438\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 23 10:54:46 crc kubenswrapper[4957]: I0123 10:54:46.472736 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/de5b5c80-661f-4108-becb-a9e0598ff438-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"de5b5c80-661f-4108-becb-a9e0598ff438\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 23 10:54:46 crc kubenswrapper[4957]: I0123 10:54:46.574406 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/de5b5c80-661f-4108-becb-a9e0598ff438-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"de5b5c80-661f-4108-becb-a9e0598ff438\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 23 10:54:46 crc kubenswrapper[4957]: I0123 10:54:46.574499 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/de5b5c80-661f-4108-becb-a9e0598ff438-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"de5b5c80-661f-4108-becb-a9e0598ff438\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 23 10:54:46 crc kubenswrapper[4957]: I0123 10:54:46.574677 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/de5b5c80-661f-4108-becb-a9e0598ff438-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"de5b5c80-661f-4108-becb-a9e0598ff438\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 23 10:54:46 crc kubenswrapper[4957]: I0123 10:54:46.610152 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/de5b5c80-661f-4108-becb-a9e0598ff438-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"de5b5c80-661f-4108-becb-a9e0598ff438\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 23 10:54:46 crc kubenswrapper[4957]: I0123 10:54:46.667891 4957 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 23 10:54:50 crc kubenswrapper[4957]: E0123 10:54:50.700473 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-5jhgc" podUID="784bb395-54fe-47ab-9fd3-0298329d8566" Jan 23 10:54:51 crc kubenswrapper[4957]: E0123 10:54:51.571840 4957 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Jan 23 10:54:51 crc kubenswrapper[4957]: E0123 10:54:51.572214 4957 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xnnbf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-l7xvl_openshift-marketplace(4210122d-91b9-4890-a8d0-23e71c42d121): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 23 10:54:51 crc kubenswrapper[4957]: E0123 10:54:51.573657 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-l7xvl" podUID="4210122d-91b9-4890-a8d0-23e71c42d121" Jan 23 10:54:51 crc kubenswrapper[4957]: E0123 10:54:51.638814 4957 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Jan 23 10:54:51 crc kubenswrapper[4957]: E0123 10:54:51.638990 4957 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-g5k4m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-5zrcr_openshift-marketplace(fd5862ba-aabd-4c6f-b546-3f22d40592b8): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 23 10:54:51 crc kubenswrapper[4957]: E0123 10:54:51.640820 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-5zrcr" podUID="fd5862ba-aabd-4c6f-b546-3f22d40592b8" Jan 23 10:54:51 crc kubenswrapper[4957]: E0123 10:54:51.718124 4957 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Jan 23 10:54:51 crc kubenswrapper[4957]: E0123 10:54:51.718345 4957 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nwt4b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-87khr_openshift-marketplace(24be1c19-2466-4e1e-9232-9697534b5d6e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 23 10:54:51 crc kubenswrapper[4957]: E0123 10:54:51.719677 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-87khr" podUID="24be1c19-2466-4e1e-9232-9697534b5d6e" Jan 23 10:54:51 crc kubenswrapper[4957]: I0123 10:54:51.731413 4957 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Jan 23 10:54:51 crc kubenswrapper[4957]: I0123 10:54:51.732348 4957 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 23 10:54:51 crc kubenswrapper[4957]: I0123 10:54:51.745180 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Jan 23 10:54:51 crc kubenswrapper[4957]: I0123 10:54:51.843866 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d86df6f9-d655-44a7-a92e-8c7bf90d92af-kubelet-dir\") pod \"installer-9-crc\" (UID: \"d86df6f9-d655-44a7-a92e-8c7bf90d92af\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 23 10:54:51 crc kubenswrapper[4957]: I0123 10:54:51.844005 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d86df6f9-d655-44a7-a92e-8c7bf90d92af-var-lock\") pod \"installer-9-crc\" (UID: \"d86df6f9-d655-44a7-a92e-8c7bf90d92af\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 23 10:54:51 crc kubenswrapper[4957]: I0123 10:54:51.844064 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d86df6f9-d655-44a7-a92e-8c7bf90d92af-kube-api-access\") pod \"installer-9-crc\" (UID: \"d86df6f9-d655-44a7-a92e-8c7bf90d92af\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 23 10:54:51 crc kubenswrapper[4957]: I0123 10:54:51.945135 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d86df6f9-d655-44a7-a92e-8c7bf90d92af-var-lock\") pod \"installer-9-crc\" (UID: \"d86df6f9-d655-44a7-a92e-8c7bf90d92af\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 23 10:54:51 crc kubenswrapper[4957]: I0123 10:54:51.945205 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d86df6f9-d655-44a7-a92e-8c7bf90d92af-kube-api-access\") pod \"installer-9-crc\" (UID: \"d86df6f9-d655-44a7-a92e-8c7bf90d92af\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 23 10:54:51 crc kubenswrapper[4957]: I0123 10:54:51.945245 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d86df6f9-d655-44a7-a92e-8c7bf90d92af-kubelet-dir\") pod \"installer-9-crc\" (UID: \"d86df6f9-d655-44a7-a92e-8c7bf90d92af\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 23 10:54:51 crc kubenswrapper[4957]: I0123 10:54:51.945253 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d86df6f9-d655-44a7-a92e-8c7bf90d92af-var-lock\") pod \"installer-9-crc\" (UID: \"d86df6f9-d655-44a7-a92e-8c7bf90d92af\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 23 10:54:51 crc kubenswrapper[4957]: I0123 10:54:51.945320 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d86df6f9-d655-44a7-a92e-8c7bf90d92af-kubelet-dir\") pod \"installer-9-crc\" (UID: \"d86df6f9-d655-44a7-a92e-8c7bf90d92af\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 23 10:54:51 crc kubenswrapper[4957]: I0123 10:54:51.964947 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d86df6f9-d655-44a7-a92e-8c7bf90d92af-kube-api-access\") pod \"installer-9-crc\" (UID: 
\"d86df6f9-d655-44a7-a92e-8c7bf90d92af\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 23 10:54:52 crc kubenswrapper[4957]: I0123 10:54:52.052613 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 23 10:54:53 crc kubenswrapper[4957]: E0123 10:54:53.606663 4957 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Jan 23 10:54:53 crc kubenswrapper[4957]: E0123 10:54:53.606999 4957 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-m4xnr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-2c452_openshift-marketplace(783873e0-bce9-4f05-849d-0fe265010d39): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 23 10:54:53 crc kubenswrapper[4957]: E0123 10:54:53.608195 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-2c452" podUID="783873e0-bce9-4f05-849d-0fe265010d39" Jan 23 10:54:54 crc kubenswrapper[4957]: E0123 10:54:54.652431 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-87khr" podUID="24be1c19-2466-4e1e-9232-9697534b5d6e" Jan 23 10:54:54 crc kubenswrapper[4957]: E0123 10:54:54.652692 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image 
\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-5zrcr" podUID="fd5862ba-aabd-4c6f-b546-3f22d40592b8" Jan 23 10:54:54 crc kubenswrapper[4957]: E0123 10:54:54.652863 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-2c452" podUID="783873e0-bce9-4f05-849d-0fe265010d39" Jan 23 10:54:54 crc kubenswrapper[4957]: E0123 10:54:54.652908 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-l7xvl" podUID="4210122d-91b9-4890-a8d0-23e71c42d121" Jan 23 10:54:54 crc kubenswrapper[4957]: E0123 10:54:54.694017 4957 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Jan 23 10:54:54 crc kubenswrapper[4957]: E0123 10:54:54.694199 4957 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vx99p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-f5qpn_openshift-marketplace(c0641b36-685a-4625-93cb-a6159de3628e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 23 10:54:54 crc kubenswrapper[4957]: I0123 10:54:54.694357 4957 scope.go:117] "RemoveContainer" containerID="372478e4e90cdb2660fa7b912f5f3461e60b2ae637861911896bf3ef647f70e7" Jan 23 10:54:54 crc kubenswrapper[4957]: E0123 10:54:54.695273 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code 
= Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-f5qpn" podUID="c0641b36-685a-4625-93cb-a6159de3628e" Jan 23 10:54:54 crc kubenswrapper[4957]: E0123 10:54:54.745805 4957 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Jan 23 10:54:54 crc kubenswrapper[4957]: E0123 10:54:54.745983 4957 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-km4jv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-l54d2_openshift-marketplace(e16b20f4-5d6a-4cf6-878a-3cf03afe72bb): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 23 10:54:54 crc kubenswrapper[4957]: E0123 10:54:54.747776 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-l54d2" podUID="e16b20f4-5d6a-4cf6-878a-3cf03afe72bb" Jan 23 10:54:54 crc kubenswrapper[4957]: I0123 10:54:54.921130 4957 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6b469fb7f8-wd5ct"] Jan 23 10:54:55 crc kubenswrapper[4957]: I0123 10:54:55.012704 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Jan 23 10:54:55 crc kubenswrapper[4957]: W0123 10:54:55.018211 4957 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podd86df6f9_d655_44a7_a92e_8c7bf90d92af.slice/crio-817728bd954477d298fbc9cc4f5c63d0c743cff26a7200bd59a37f5cdb1eb6bc WatchSource:0}: Error finding container 817728bd954477d298fbc9cc4f5c63d0c743cff26a7200bd59a37f5cdb1eb6bc: 
Status 404 returned error can't find the container with id 817728bd954477d298fbc9cc4f5c63d0c743cff26a7200bd59a37f5cdb1eb6bc Jan 23 10:54:55 crc kubenswrapper[4957]: I0123 10:54:55.184238 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Jan 23 10:54:55 crc kubenswrapper[4957]: I0123 10:54:55.188130 4957 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-856dc49d-tv2rw"] Jan 23 10:54:55 crc kubenswrapper[4957]: I0123 10:54:55.448729 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"d86df6f9-d655-44a7-a92e-8c7bf90d92af","Type":"ContainerStarted","Data":"ffab0ed388b1fb0a4fd8da39cc0262d4ad82fac0ffd44d6e038643c06ff87b7f"} Jan 23 10:54:55 crc kubenswrapper[4957]: I0123 10:54:55.449050 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"d86df6f9-d655-44a7-a92e-8c7bf90d92af","Type":"ContainerStarted","Data":"817728bd954477d298fbc9cc4f5c63d0c743cff26a7200bd59a37f5cdb1eb6bc"} Jan 23 10:54:55 crc kubenswrapper[4957]: I0123 10:54:55.451841 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6b469fb7f8-wd5ct" event={"ID":"77c5575a-d73b-43a9-896d-58bf5c1ebe2b","Type":"ContainerStarted","Data":"0b6f3a11f6f7f851a5beafaf92879d6a319f4e4b72ad4f55c9046412147b04a6"} Jan 23 10:54:55 crc kubenswrapper[4957]: I0123 10:54:55.451904 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6b469fb7f8-wd5ct" event={"ID":"77c5575a-d73b-43a9-896d-58bf5c1ebe2b","Type":"ContainerStarted","Data":"92178987319d84ab0e43a0f5a56cdeded26029e13a76359acf805209088da355"} Jan 23 10:54:55 crc kubenswrapper[4957]: I0123 10:54:55.452033 4957 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6b469fb7f8-wd5ct" podUID="77c5575a-d73b-43a9-896d-58bf5c1ebe2b" containerName="route-controller-manager" containerID="cri-o://0b6f3a11f6f7f851a5beafaf92879d6a319f4e4b72ad4f55c9046412147b04a6" gracePeriod=30 Jan 23 10:54:55 crc kubenswrapper[4957]: I0123 10:54:55.452607 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6b469fb7f8-wd5ct" Jan 23 10:54:55 crc kubenswrapper[4957]: I0123 10:54:55.455806 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-856dc49d-tv2rw" event={"ID":"b6a744ab-4b97-4bbc-a496-0a665b6eba74","Type":"ContainerStarted","Data":"fd67508d42d188c52e1a81ba21d96e1a37b7bc774578806ab862a1ce1d7c0dd8"} Jan 23 10:54:55 crc kubenswrapper[4957]: I0123 10:54:55.455918 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-856dc49d-tv2rw" event={"ID":"b6a744ab-4b97-4bbc-a496-0a665b6eba74","Type":"ContainerStarted","Data":"c89efd83765aa6b5208dafa5d565a6457dedc2ec3bd569f5d554e4b19ef077c5"} Jan 23 10:54:55 crc kubenswrapper[4957]: I0123 10:54:55.456126 4957 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-856dc49d-tv2rw" podUID="b6a744ab-4b97-4bbc-a496-0a665b6eba74" containerName="controller-manager" containerID="cri-o://fd67508d42d188c52e1a81ba21d96e1a37b7bc774578806ab862a1ce1d7c0dd8" gracePeriod=30 Jan 23 10:54:55 crc 
kubenswrapper[4957]: I0123 10:54:55.456386 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-856dc49d-tv2rw" Jan 23 10:54:55 crc kubenswrapper[4957]: I0123 10:54:55.461295 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-856dc49d-tv2rw" Jan 23 10:54:55 crc kubenswrapper[4957]: I0123 10:54:55.463966 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"de5b5c80-661f-4108-becb-a9e0598ff438","Type":"ContainerStarted","Data":"b71251bfd098c912e02839bb5254739507905c8b74886405596c7691db86d716"} Jan 23 10:54:55 crc kubenswrapper[4957]: E0123 10:54:55.465626 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-l54d2" podUID="e16b20f4-5d6a-4cf6-878a-3cf03afe72bb" Jan 23 10:54:55 crc kubenswrapper[4957]: E0123 10:54:55.466154 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-f5qpn" podUID="c0641b36-685a-4625-93cb-a6159de3628e" Jan 23 10:54:55 crc kubenswrapper[4957]: I0123 10:54:55.474237 4957 patch_prober.go:28] interesting pod/route-controller-manager-6b469fb7f8-wd5ct container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.55:8443/healthz\": read tcp 10.217.0.2:37560->10.217.0.55:8443: read: connection reset by peer" start-of-body= Jan 23 10:54:55 crc kubenswrapper[4957]: I0123 10:54:55.474321 4957 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6b469fb7f8-wd5ct" podUID="77c5575a-d73b-43a9-896d-58bf5c1ebe2b" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.55:8443/healthz\": read tcp 10.217.0.2:37560->10.217.0.55:8443: read: connection reset by peer" Jan 23 10:54:55 crc kubenswrapper[4957]: I0123 10:54:55.477841 4957 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=4.477823005 podStartE2EDuration="4.477823005s" podCreationTimestamp="2026-01-23 10:54:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-23 10:54:55.474721162 +0000 UTC m=+205.011973849" watchObservedRunningTime="2026-01-23 10:54:55.477823005 +0000 UTC m=+205.015075692" Jan 23 10:54:55 crc kubenswrapper[4957]: I0123 10:54:55.501748 4957 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6b469fb7f8-wd5ct" podStartSLOduration=36.501721024 podStartE2EDuration="36.501721024s" podCreationTimestamp="2026-01-23 10:54:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-23 10:54:55.49783264 +0000 UTC m=+205.035085327" watchObservedRunningTime="2026-01-23 10:54:55.501721024 +0000 UTC m=+205.038973721" Jan 23 10:54:55 crc kubenswrapper[4957]: I0123 
10:54:55.553637 4957 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-856dc49d-tv2rw" podStartSLOduration=36.553616262 podStartE2EDuration="36.553616262s" podCreationTimestamp="2026-01-23 10:54:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-23 10:54:55.550134989 +0000 UTC m=+205.087387696" watchObservedRunningTime="2026-01-23 10:54:55.553616262 +0000 UTC m=+205.090868949" Jan 23 10:54:55 crc kubenswrapper[4957]: I0123 10:54:55.944304 4957 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-856dc49d-tv2rw" Jan 23 10:54:55 crc kubenswrapper[4957]: I0123 10:54:55.951785 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-6b469fb7f8-wd5ct_77c5575a-d73b-43a9-896d-58bf5c1ebe2b/route-controller-manager/0.log" Jan 23 10:54:55 crc kubenswrapper[4957]: I0123 10:54:55.951847 4957 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6b469fb7f8-wd5ct" Jan 23 10:54:55 crc kubenswrapper[4957]: I0123 10:54:55.992303 4957 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-54ddc8bc77-xr4fw"] Jan 23 10:54:55 crc kubenswrapper[4957]: E0123 10:54:55.993012 4957 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77c5575a-d73b-43a9-896d-58bf5c1ebe2b" containerName="route-controller-manager" Jan 23 10:54:55 crc kubenswrapper[4957]: I0123 10:54:55.993028 4957 state_mem.go:107] "Deleted CPUSet assignment" podUID="77c5575a-d73b-43a9-896d-58bf5c1ebe2b" containerName="route-controller-manager" Jan 23 10:54:55 crc kubenswrapper[4957]: E0123 10:54:55.993063 4957 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6a744ab-4b97-4bbc-a496-0a665b6eba74" containerName="controller-manager" Jan 23 10:54:55 crc kubenswrapper[4957]: I0123 10:54:55.993072 4957 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6a744ab-4b97-4bbc-a496-0a665b6eba74" containerName="controller-manager" Jan 23 10:54:55 crc kubenswrapper[4957]: I0123 10:54:55.993371 4957 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6a744ab-4b97-4bbc-a496-0a665b6eba74" containerName="controller-manager" Jan 23 10:54:55 crc kubenswrapper[4957]: I0123 10:54:55.993400 4957 memory_manager.go:354] "RemoveStaleState removing state" podUID="77c5575a-d73b-43a9-896d-58bf5c1ebe2b" containerName="route-controller-manager" Jan 23 10:54:55 crc kubenswrapper[4957]: I0123 10:54:55.994303 4957 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-54ddc8bc77-xr4fw" Jan 23 10:54:56 crc kubenswrapper[4957]: I0123 10:54:56.001307 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-54ddc8bc77-xr4fw"] Jan 23 10:54:56 crc kubenswrapper[4957]: I0123 10:54:56.098120 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77c5575a-d73b-43a9-896d-58bf5c1ebe2b-config\") pod \"77c5575a-d73b-43a9-896d-58bf5c1ebe2b\" (UID: \"77c5575a-d73b-43a9-896d-58bf5c1ebe2b\") " Jan 23 10:54:56 crc kubenswrapper[4957]: I0123 10:54:56.098200 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s57zb\" (UniqueName: \"kubernetes.io/projected/b6a744ab-4b97-4bbc-a496-0a665b6eba74-kube-api-access-s57zb\") pod \"b6a744ab-4b97-4bbc-a496-0a665b6eba74\" (UID: \"b6a744ab-4b97-4bbc-a496-0a665b6eba74\") " Jan 23 10:54:56 crc kubenswrapper[4957]: I0123 10:54:56.098265 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/77c5575a-d73b-43a9-896d-58bf5c1ebe2b-client-ca\") pod \"77c5575a-d73b-43a9-896d-58bf5c1ebe2b\" (UID: \"77c5575a-d73b-43a9-896d-58bf5c1ebe2b\") " Jan 23 10:54:56 crc kubenswrapper[4957]: I0123 10:54:56.098311 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b6a744ab-4b97-4bbc-a496-0a665b6eba74-proxy-ca-bundles\") pod \"b6a744ab-4b97-4bbc-a496-0a665b6eba74\" (UID: \"b6a744ab-4b97-4bbc-a496-0a665b6eba74\") " Jan 23 10:54:56 crc kubenswrapper[4957]: I0123 10:54:56.098401 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6a744ab-4b97-4bbc-a496-0a665b6eba74-config\") pod \"b6a744ab-4b97-4bbc-a496-0a665b6eba74\" (UID: \"b6a744ab-4b97-4bbc-a496-0a665b6eba74\") " Jan 23 10:54:56 crc kubenswrapper[4957]: I0123 10:54:56.098474 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ql9fs\" (UniqueName: \"kubernetes.io/projected/77c5575a-d73b-43a9-896d-58bf5c1ebe2b-kube-api-access-ql9fs\") pod \"77c5575a-d73b-43a9-896d-58bf5c1ebe2b\" (UID: \"77c5575a-d73b-43a9-896d-58bf5c1ebe2b\") " Jan 23 10:54:56 crc kubenswrapper[4957]: I0123 10:54:56.098520 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/77c5575a-d73b-43a9-896d-58bf5c1ebe2b-serving-cert\") pod \"77c5575a-d73b-43a9-896d-58bf5c1ebe2b\" (UID: \"77c5575a-d73b-43a9-896d-58bf5c1ebe2b\") " Jan 23 10:54:56 crc kubenswrapper[4957]: I0123 10:54:56.098558 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b6a744ab-4b97-4bbc-a496-0a665b6eba74-client-ca\") pod \"b6a744ab-4b97-4bbc-a496-0a665b6eba74\" (UID: \"b6a744ab-4b97-4bbc-a496-0a665b6eba74\") " Jan 23 10:54:56 crc kubenswrapper[4957]: I0123 10:54:56.099556 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b6a744ab-4b97-4bbc-a496-0a665b6eba74-serving-cert\") pod \"b6a744ab-4b97-4bbc-a496-0a665b6eba74\" (UID: \"b6a744ab-4b97-4bbc-a496-0a665b6eba74\") " Jan 23 10:54:56 crc kubenswrapper[4957]: I0123 10:54:56.099130 4957 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6a744ab-4b97-4bbc-a496-0a665b6eba74-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "b6a744ab-4b97-4bbc-a496-0a665b6eba74" (UID: "b6a744ab-4b97-4bbc-a496-0a665b6eba74"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 10:54:56 crc kubenswrapper[4957]: I0123 10:54:56.099735 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5ff23ec1-7d08-4d1b-bdd5-6e97a2662479-serving-cert\") pod \"controller-manager-54ddc8bc77-xr4fw\" (UID: \"5ff23ec1-7d08-4d1b-bdd5-6e97a2662479\") " pod="openshift-controller-manager/controller-manager-54ddc8bc77-xr4fw" Jan 23 10:54:56 crc kubenswrapper[4957]: I0123 10:54:56.099199 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6a744ab-4b97-4bbc-a496-0a665b6eba74-client-ca" (OuterVolumeSpecName: "client-ca") pod "b6a744ab-4b97-4bbc-a496-0a665b6eba74" (UID: "b6a744ab-4b97-4bbc-a496-0a665b6eba74"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 10:54:56 crc kubenswrapper[4957]: I0123 10:54:56.099228 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77c5575a-d73b-43a9-896d-58bf5c1ebe2b-config" (OuterVolumeSpecName: "config") pod "77c5575a-d73b-43a9-896d-58bf5c1ebe2b" (UID: "77c5575a-d73b-43a9-896d-58bf5c1ebe2b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 10:54:56 crc kubenswrapper[4957]: I0123 10:54:56.099260 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6a744ab-4b97-4bbc-a496-0a665b6eba74-config" (OuterVolumeSpecName: "config") pod "b6a744ab-4b97-4bbc-a496-0a665b6eba74" (UID: "b6a744ab-4b97-4bbc-a496-0a665b6eba74"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 10:54:56 crc kubenswrapper[4957]: I0123 10:54:56.099316 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77c5575a-d73b-43a9-896d-58bf5c1ebe2b-client-ca" (OuterVolumeSpecName: "client-ca") pod "77c5575a-d73b-43a9-896d-58bf5c1ebe2b" (UID: "77c5575a-d73b-43a9-896d-58bf5c1ebe2b"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 10:54:56 crc kubenswrapper[4957]: I0123 10:54:56.100004 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ff23ec1-7d08-4d1b-bdd5-6e97a2662479-config\") pod \"controller-manager-54ddc8bc77-xr4fw\" (UID: \"5ff23ec1-7d08-4d1b-bdd5-6e97a2662479\") " pod="openshift-controller-manager/controller-manager-54ddc8bc77-xr4fw" Jan 23 10:54:56 crc kubenswrapper[4957]: I0123 10:54:56.100047 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5ff23ec1-7d08-4d1b-bdd5-6e97a2662479-proxy-ca-bundles\") pod \"controller-manager-54ddc8bc77-xr4fw\" (UID: \"5ff23ec1-7d08-4d1b-bdd5-6e97a2662479\") " pod="openshift-controller-manager/controller-manager-54ddc8bc77-xr4fw" Jan 23 10:54:56 crc kubenswrapper[4957]: I0123 10:54:56.100099 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5ff23ec1-7d08-4d1b-bdd5-6e97a2662479-client-ca\") pod \"controller-manager-54ddc8bc77-xr4fw\" (UID: \"5ff23ec1-7d08-4d1b-bdd5-6e97a2662479\") " pod="openshift-controller-manager/controller-manager-54ddc8bc77-xr4fw" Jan 23 10:54:56 crc kubenswrapper[4957]: I0123 10:54:56.100230 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8cgz\" (UniqueName: \"kubernetes.io/projected/5ff23ec1-7d08-4d1b-bdd5-6e97a2662479-kube-api-access-f8cgz\") pod \"controller-manager-54ddc8bc77-xr4fw\" (UID: \"5ff23ec1-7d08-4d1b-bdd5-6e97a2662479\") " pod="openshift-controller-manager/controller-manager-54ddc8bc77-xr4fw" Jan 23 10:54:56 crc kubenswrapper[4957]: I0123 10:54:56.100616 4957 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77c5575a-d73b-43a9-896d-58bf5c1ebe2b-config\") on node \"crc\" DevicePath \"\"" Jan 23 10:54:56 crc kubenswrapper[4957]: I0123 10:54:56.100641 4957 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/77c5575a-d73b-43a9-896d-58bf5c1ebe2b-client-ca\") on node \"crc\" DevicePath \"\"" Jan 23 10:54:56 crc kubenswrapper[4957]: I0123 10:54:56.100655 4957 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b6a744ab-4b97-4bbc-a496-0a665b6eba74-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 23 10:54:56 crc kubenswrapper[4957]: I0123 10:54:56.100669 4957 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6a744ab-4b97-4bbc-a496-0a665b6eba74-config\") on node \"crc\" DevicePath \"\"" Jan 23 10:54:56 crc kubenswrapper[4957]: I0123 10:54:56.100680 4957 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b6a744ab-4b97-4bbc-a496-0a665b6eba74-client-ca\") on node \"crc\" DevicePath \"\"" Jan 23 10:54:56 crc kubenswrapper[4957]: I0123 10:54:56.103367 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6a744ab-4b97-4bbc-a496-0a665b6eba74-kube-api-access-s57zb" (OuterVolumeSpecName: "kube-api-access-s57zb") pod "b6a744ab-4b97-4bbc-a496-0a665b6eba74" (UID: "b6a744ab-4b97-4bbc-a496-0a665b6eba74"). InnerVolumeSpecName "kube-api-access-s57zb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 23 10:54:56 crc kubenswrapper[4957]: I0123 10:54:56.103427 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6a744ab-4b97-4bbc-a496-0a665b6eba74-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "b6a744ab-4b97-4bbc-a496-0a665b6eba74" (UID: "b6a744ab-4b97-4bbc-a496-0a665b6eba74"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 23 10:54:56 crc kubenswrapper[4957]: I0123 10:54:56.103640 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77c5575a-d73b-43a9-896d-58bf5c1ebe2b-kube-api-access-ql9fs" (OuterVolumeSpecName: "kube-api-access-ql9fs") pod "77c5575a-d73b-43a9-896d-58bf5c1ebe2b" (UID: "77c5575a-d73b-43a9-896d-58bf5c1ebe2b"). InnerVolumeSpecName "kube-api-access-ql9fs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 23 10:54:56 crc kubenswrapper[4957]: I0123 10:54:56.103753 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77c5575a-d73b-43a9-896d-58bf5c1ebe2b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "77c5575a-d73b-43a9-896d-58bf5c1ebe2b" (UID: "77c5575a-d73b-43a9-896d-58bf5c1ebe2b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 23 10:54:56 crc kubenswrapper[4957]: I0123 10:54:56.201812 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8cgz\" (UniqueName: \"kubernetes.io/projected/5ff23ec1-7d08-4d1b-bdd5-6e97a2662479-kube-api-access-f8cgz\") pod \"controller-manager-54ddc8bc77-xr4fw\" (UID: \"5ff23ec1-7d08-4d1b-bdd5-6e97a2662479\") " pod="openshift-controller-manager/controller-manager-54ddc8bc77-xr4fw" Jan 23 10:54:56 crc kubenswrapper[4957]: I0123 10:54:56.201878 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5ff23ec1-7d08-4d1b-bdd5-6e97a2662479-serving-cert\") pod \"controller-manager-54ddc8bc77-xr4fw\" (UID: \"5ff23ec1-7d08-4d1b-bdd5-6e97a2662479\") " pod="openshift-controller-manager/controller-manager-54ddc8bc77-xr4fw" Jan 23 10:54:56 crc kubenswrapper[4957]: I0123 10:54:56.201949 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ff23ec1-7d08-4d1b-bdd5-6e97a2662479-config\") pod \"controller-manager-54ddc8bc77-xr4fw\" (UID: \"5ff23ec1-7d08-4d1b-bdd5-6e97a2662479\") " pod="openshift-controller-manager/controller-manager-54ddc8bc77-xr4fw" Jan 23 10:54:56 crc kubenswrapper[4957]: I0123 10:54:56.201999 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5ff23ec1-7d08-4d1b-bdd5-6e97a2662479-proxy-ca-bundles\") pod \"controller-manager-54ddc8bc77-xr4fw\" (UID: \"5ff23ec1-7d08-4d1b-bdd5-6e97a2662479\") " pod="openshift-controller-manager/controller-manager-54ddc8bc77-xr4fw" Jan 23 10:54:56 crc kubenswrapper[4957]: I0123 10:54:56.202032 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5ff23ec1-7d08-4d1b-bdd5-6e97a2662479-client-ca\") pod \"controller-manager-54ddc8bc77-xr4fw\" (UID: \"5ff23ec1-7d08-4d1b-bdd5-6e97a2662479\") " pod="openshift-controller-manager/controller-manager-54ddc8bc77-xr4fw" Jan 23 10:54:56 crc kubenswrapper[4957]: I0123 10:54:56.202090 4957 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ql9fs\" (UniqueName: \"kubernetes.io/projected/77c5575a-d73b-43a9-896d-58bf5c1ebe2b-kube-api-access-ql9fs\") on node \"crc\" DevicePath \"\"" Jan 23 10:54:56 crc kubenswrapper[4957]: I0123 10:54:56.202104 4957 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/77c5575a-d73b-43a9-896d-58bf5c1ebe2b-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 23 10:54:56 crc kubenswrapper[4957]: I0123 10:54:56.202116 4957 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b6a744ab-4b97-4bbc-a496-0a665b6eba74-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 23 10:54:56 crc kubenswrapper[4957]: I0123 10:54:56.202128 4957 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s57zb\" (UniqueName: \"kubernetes.io/projected/b6a744ab-4b97-4bbc-a496-0a665b6eba74-kube-api-access-s57zb\") on node \"crc\" DevicePath \"\"" Jan 23 10:54:56 crc kubenswrapper[4957]: I0123 10:54:56.203102 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5ff23ec1-7d08-4d1b-bdd5-6e97a2662479-client-ca\") pod \"controller-manager-54ddc8bc77-xr4fw\" (UID: \"5ff23ec1-7d08-4d1b-bdd5-6e97a2662479\") " pod="openshift-controller-manager/controller-manager-54ddc8bc77-xr4fw" Jan 23 10:54:56 crc kubenswrapper[4957]: I0123 10:54:56.203420 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5ff23ec1-7d08-4d1b-bdd5-6e97a2662479-proxy-ca-bundles\") pod \"controller-manager-54ddc8bc77-xr4fw\" (UID: \"5ff23ec1-7d08-4d1b-bdd5-6e97a2662479\") " pod="openshift-controller-manager/controller-manager-54ddc8bc77-xr4fw" Jan 23 10:54:56 crc kubenswrapper[4957]: I0123 10:54:56.203593 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ff23ec1-7d08-4d1b-bdd5-6e97a2662479-config\") pod \"controller-manager-54ddc8bc77-xr4fw\" (UID: \"5ff23ec1-7d08-4d1b-bdd5-6e97a2662479\") " pod="openshift-controller-manager/controller-manager-54ddc8bc77-xr4fw" Jan 23 10:54:56 crc kubenswrapper[4957]: I0123 10:54:56.205221 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5ff23ec1-7d08-4d1b-bdd5-6e97a2662479-serving-cert\") pod \"controller-manager-54ddc8bc77-xr4fw\" (UID: \"5ff23ec1-7d08-4d1b-bdd5-6e97a2662479\") " pod="openshift-controller-manager/controller-manager-54ddc8bc77-xr4fw" Jan 23 10:54:56 crc kubenswrapper[4957]: I0123 10:54:56.222734 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8cgz\" (UniqueName: \"kubernetes.io/projected/5ff23ec1-7d08-4d1b-bdd5-6e97a2662479-kube-api-access-f8cgz\") pod \"controller-manager-54ddc8bc77-xr4fw\" (UID: \"5ff23ec1-7d08-4d1b-bdd5-6e97a2662479\") " pod="openshift-controller-manager/controller-manager-54ddc8bc77-xr4fw" Jan 23 10:54:56 crc kubenswrapper[4957]: I0123 10:54:56.320691 4957 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-54ddc8bc77-xr4fw" Jan 23 10:54:56 crc kubenswrapper[4957]: I0123 10:54:56.478436 4957 generic.go:334] "Generic (PLEG): container finished" podID="b6a744ab-4b97-4bbc-a496-0a665b6eba74" containerID="fd67508d42d188c52e1a81ba21d96e1a37b7bc774578806ab862a1ce1d7c0dd8" exitCode=0 Jan 23 10:54:56 crc kubenswrapper[4957]: I0123 10:54:56.478512 4957 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-856dc49d-tv2rw" Jan 23 10:54:56 crc kubenswrapper[4957]: I0123 10:54:56.478560 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-856dc49d-tv2rw" event={"ID":"b6a744ab-4b97-4bbc-a496-0a665b6eba74","Type":"ContainerDied","Data":"fd67508d42d188c52e1a81ba21d96e1a37b7bc774578806ab862a1ce1d7c0dd8"} Jan 23 10:54:56 crc kubenswrapper[4957]: I0123 10:54:56.479626 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-856dc49d-tv2rw" event={"ID":"b6a744ab-4b97-4bbc-a496-0a665b6eba74","Type":"ContainerDied","Data":"c89efd83765aa6b5208dafa5d565a6457dedc2ec3bd569f5d554e4b19ef077c5"} Jan 23 10:54:56 crc kubenswrapper[4957]: I0123 10:54:56.479673 4957 scope.go:117] "RemoveContainer" containerID="fd67508d42d188c52e1a81ba21d96e1a37b7bc774578806ab862a1ce1d7c0dd8" Jan 23 10:54:56 crc kubenswrapper[4957]: I0123 10:54:56.481679 4957 generic.go:334] "Generic (PLEG): container finished" podID="de5b5c80-661f-4108-becb-a9e0598ff438" containerID="c92584b17ed09ffefba1489626de8d4f497a0e8954f4ba3317d133a41b1d2474" exitCode=0 Jan 23 10:54:56 crc kubenswrapper[4957]: I0123 10:54:56.481803 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"de5b5c80-661f-4108-becb-a9e0598ff438","Type":"ContainerDied","Data":"c92584b17ed09ffefba1489626de8d4f497a0e8954f4ba3317d133a41b1d2474"} Jan 23 10:54:56 crc kubenswrapper[4957]: I0123 10:54:56.486475 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-6b469fb7f8-wd5ct_77c5575a-d73b-43a9-896d-58bf5c1ebe2b/route-controller-manager/0.log" Jan 23 10:54:56 crc kubenswrapper[4957]: I0123 10:54:56.486522 4957 generic.go:334] "Generic (PLEG): container finished" podID="77c5575a-d73b-43a9-896d-58bf5c1ebe2b" containerID="0b6f3a11f6f7f851a5beafaf92879d6a319f4e4b72ad4f55c9046412147b04a6" exitCode=255 Jan 23 10:54:56 crc kubenswrapper[4957]: I0123 10:54:56.486591 4957 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6b469fb7f8-wd5ct" Jan 23 10:54:56 crc kubenswrapper[4957]: I0123 10:54:56.486623 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6b469fb7f8-wd5ct" event={"ID":"77c5575a-d73b-43a9-896d-58bf5c1ebe2b","Type":"ContainerDied","Data":"0b6f3a11f6f7f851a5beafaf92879d6a319f4e4b72ad4f55c9046412147b04a6"} Jan 23 10:54:56 crc kubenswrapper[4957]: I0123 10:54:56.486657 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6b469fb7f8-wd5ct" event={"ID":"77c5575a-d73b-43a9-896d-58bf5c1ebe2b","Type":"ContainerDied","Data":"92178987319d84ab0e43a0f5a56cdeded26029e13a76359acf805209088da355"} Jan 23 10:54:56 crc kubenswrapper[4957]: I0123 10:54:56.503798 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-54ddc8bc77-xr4fw"] Jan 23 10:54:56 crc kubenswrapper[4957]: I0123 10:54:56.525533 4957 scope.go:117] "RemoveContainer" containerID="fd67508d42d188c52e1a81ba21d96e1a37b7bc774578806ab862a1ce1d7c0dd8" Jan 23 10:54:56 crc kubenswrapper[4957]: E0123 10:54:56.526306 4957 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd67508d42d188c52e1a81ba21d96e1a37b7bc774578806ab862a1ce1d7c0dd8\": container with ID starting with fd67508d42d188c52e1a81ba21d96e1a37b7bc774578806ab862a1ce1d7c0dd8 not found: ID does not exist" containerID="fd67508d42d188c52e1a81ba21d96e1a37b7bc774578806ab862a1ce1d7c0dd8" Jan 23 10:54:56 crc kubenswrapper[4957]: I0123 10:54:56.526342 4957 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd67508d42d188c52e1a81ba21d96e1a37b7bc774578806ab862a1ce1d7c0dd8"} err="failed to get container status \"fd67508d42d188c52e1a81ba21d96e1a37b7bc774578806ab862a1ce1d7c0dd8\": rpc error: code = NotFound desc = could not find container \"fd67508d42d188c52e1a81ba21d96e1a37b7bc774578806ab862a1ce1d7c0dd8\": container with ID starting with fd67508d42d188c52e1a81ba21d96e1a37b7bc774578806ab862a1ce1d7c0dd8 not found: ID does not exist" Jan 23 10:54:56 crc kubenswrapper[4957]: I0123 10:54:56.526395 4957 scope.go:117] "RemoveContainer" containerID="0b6f3a11f6f7f851a5beafaf92879d6a319f4e4b72ad4f55c9046412147b04a6" Jan 23 10:54:56 crc kubenswrapper[4957]: W0123 10:54:56.532028 4957 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ff23ec1_7d08_4d1b_bdd5_6e97a2662479.slice/crio-2d79e277f36cc5d19b47913a8215a517dab1cb5e92d7c6a0242e3f656a5ce537 WatchSource:0}: Error finding container 2d79e277f36cc5d19b47913a8215a517dab1cb5e92d7c6a0242e3f656a5ce537: Status 404 returned error can't find the container with id 2d79e277f36cc5d19b47913a8215a517dab1cb5e92d7c6a0242e3f656a5ce537 Jan 23 10:54:56 crc kubenswrapper[4957]: I0123 10:54:56.534132 4957 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-856dc49d-tv2rw"] Jan 23 10:54:56 crc kubenswrapper[4957]: I0123 10:54:56.540331 4957 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-856dc49d-tv2rw"] Jan 23 10:54:56 crc kubenswrapper[4957]: I0123 10:54:56.544360 4957 scope.go:117] "RemoveContainer" containerID="0b6f3a11f6f7f851a5beafaf92879d6a319f4e4b72ad4f55c9046412147b04a6" Jan 23 10:54:56 crc kubenswrapper[4957]: E0123 
10:54:56.544797 4957 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b6f3a11f6f7f851a5beafaf92879d6a319f4e4b72ad4f55c9046412147b04a6\": container with ID starting with 0b6f3a11f6f7f851a5beafaf92879d6a319f4e4b72ad4f55c9046412147b04a6 not found: ID does not exist" containerID="0b6f3a11f6f7f851a5beafaf92879d6a319f4e4b72ad4f55c9046412147b04a6" Jan 23 10:54:56 crc kubenswrapper[4957]: I0123 10:54:56.544835 4957 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b6f3a11f6f7f851a5beafaf92879d6a319f4e4b72ad4f55c9046412147b04a6"} err="failed to get container status \"0b6f3a11f6f7f851a5beafaf92879d6a319f4e4b72ad4f55c9046412147b04a6\": rpc error: code = NotFound desc = could not find container \"0b6f3a11f6f7f851a5beafaf92879d6a319f4e4b72ad4f55c9046412147b04a6\": container with ID starting with 0b6f3a11f6f7f851a5beafaf92879d6a319f4e4b72ad4f55c9046412147b04a6 not found: ID does not exist" Jan 23 10:54:56 crc kubenswrapper[4957]: I0123 10:54:56.546805 4957 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6b469fb7f8-wd5ct"] Jan 23 10:54:56 crc kubenswrapper[4957]: I0123 10:54:56.550501 4957 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6b469fb7f8-wd5ct"] Jan 23 10:54:56 crc kubenswrapper[4957]: I0123 10:54:56.784046 4957 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77c5575a-d73b-43a9-896d-58bf5c1ebe2b" path="/var/lib/kubelet/pods/77c5575a-d73b-43a9-896d-58bf5c1ebe2b/volumes" Jan 23 10:54:56 crc kubenswrapper[4957]: I0123 10:54:56.784984 4957 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6a744ab-4b97-4bbc-a496-0a665b6eba74" path="/var/lib/kubelet/pods/b6a744ab-4b97-4bbc-a496-0a665b6eba74/volumes" Jan 23 10:54:57 crc kubenswrapper[4957]: I0123 10:54:57.492586 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-54ddc8bc77-xr4fw" event={"ID":"5ff23ec1-7d08-4d1b-bdd5-6e97a2662479","Type":"ContainerStarted","Data":"006a101a7bfcb87a430da458e87a04fdf85da360b22196ba744994e050bb9fdb"} Jan 23 10:54:57 crc kubenswrapper[4957]: I0123 10:54:57.492876 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-54ddc8bc77-xr4fw" Jan 23 10:54:57 crc kubenswrapper[4957]: I0123 10:54:57.492907 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-54ddc8bc77-xr4fw" event={"ID":"5ff23ec1-7d08-4d1b-bdd5-6e97a2662479","Type":"ContainerStarted","Data":"2d79e277f36cc5d19b47913a8215a517dab1cb5e92d7c6a0242e3f656a5ce537"} Jan 23 10:54:57 crc kubenswrapper[4957]: I0123 10:54:57.506708 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-54ddc8bc77-xr4fw" Jan 23 10:54:57 crc kubenswrapper[4957]: I0123 10:54:57.523984 4957 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-54ddc8bc77-xr4fw" podStartSLOduration=18.52396148 podStartE2EDuration="18.52396148s" podCreationTimestamp="2026-01-23 10:54:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-23 10:54:57.519168242 +0000 UTC m=+207.056420949" watchObservedRunningTime="2026-01-23 
10:54:57.52396148 +0000 UTC m=+207.061214187" Jan 23 10:54:57 crc kubenswrapper[4957]: I0123 10:54:57.687122 4957 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 23 10:54:57 crc kubenswrapper[4957]: I0123 10:54:57.825239 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/de5b5c80-661f-4108-becb-a9e0598ff438-kubelet-dir\") pod \"de5b5c80-661f-4108-becb-a9e0598ff438\" (UID: \"de5b5c80-661f-4108-becb-a9e0598ff438\") " Jan 23 10:54:57 crc kubenswrapper[4957]: I0123 10:54:57.825426 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/de5b5c80-661f-4108-becb-a9e0598ff438-kube-api-access\") pod \"de5b5c80-661f-4108-becb-a9e0598ff438\" (UID: \"de5b5c80-661f-4108-becb-a9e0598ff438\") " Jan 23 10:54:57 crc kubenswrapper[4957]: I0123 10:54:57.825683 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/de5b5c80-661f-4108-becb-a9e0598ff438-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "de5b5c80-661f-4108-becb-a9e0598ff438" (UID: "de5b5c80-661f-4108-becb-a9e0598ff438"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 23 10:54:57 crc kubenswrapper[4957]: I0123 10:54:57.830995 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de5b5c80-661f-4108-becb-a9e0598ff438-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "de5b5c80-661f-4108-becb-a9e0598ff438" (UID: "de5b5c80-661f-4108-becb-a9e0598ff438"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 23 10:54:57 crc kubenswrapper[4957]: I0123 10:54:57.928666 4957 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/de5b5c80-661f-4108-becb-a9e0598ff438-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 23 10:54:57 crc kubenswrapper[4957]: I0123 10:54:57.928719 4957 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/de5b5c80-661f-4108-becb-a9e0598ff438-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 23 10:54:58 crc kubenswrapper[4957]: I0123 10:54:58.510875 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tbq4h" event={"ID":"6e09bef4-a911-4771-b0e4-233eba62eddf","Type":"ContainerStarted","Data":"6fae11f0ca940c5499eadffc0ba9c700a45507908d98e2a70ca4162fa9f2de70"} Jan 23 10:54:58 crc kubenswrapper[4957]: I0123 10:54:58.512749 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"de5b5c80-661f-4108-becb-a9e0598ff438","Type":"ContainerDied","Data":"b71251bfd098c912e02839bb5254739507905c8b74886405596c7691db86d716"} Jan 23 10:54:58 crc kubenswrapper[4957]: I0123 10:54:58.512780 4957 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b71251bfd098c912e02839bb5254739507905c8b74886405596c7691db86d716" Jan 23 10:54:58 crc kubenswrapper[4957]: I0123 10:54:58.512814 4957 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 23 10:54:58 crc kubenswrapper[4957]: I0123 10:54:58.559603 4957 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7486dcfd4c-szhlc"] Jan 23 10:54:58 crc kubenswrapper[4957]: E0123 10:54:58.560057 4957 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de5b5c80-661f-4108-becb-a9e0598ff438" containerName="pruner" Jan 23 10:54:58 crc kubenswrapper[4957]: I0123 10:54:58.560152 4957 state_mem.go:107] "Deleted CPUSet assignment" podUID="de5b5c80-661f-4108-becb-a9e0598ff438" containerName="pruner" Jan 23 10:54:58 crc kubenswrapper[4957]: I0123 10:54:58.560365 4957 memory_manager.go:354] "RemoveStaleState removing state" podUID="de5b5c80-661f-4108-becb-a9e0598ff438" containerName="pruner" Jan 23 10:54:58 crc kubenswrapper[4957]: I0123 10:54:58.560928 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7486dcfd4c-szhlc" Jan 23 10:54:58 crc kubenswrapper[4957]: I0123 10:54:58.562995 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 23 10:54:58 crc kubenswrapper[4957]: I0123 10:54:58.564590 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 23 10:54:58 crc kubenswrapper[4957]: I0123 10:54:58.564601 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 23 10:54:58 crc kubenswrapper[4957]: I0123 10:54:58.566116 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 23 10:54:58 crc kubenswrapper[4957]: I0123 10:54:58.568011 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 23 10:54:58 crc kubenswrapper[4957]: I0123 10:54:58.570994 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 23 10:54:58 crc kubenswrapper[4957]: I0123 10:54:58.574685 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7486dcfd4c-szhlc"] Jan 23 10:54:58 crc kubenswrapper[4957]: I0123 10:54:58.638362 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e6bc4c7-f6a3-4e6f-82be-9a652d8e86d9-config\") pod \"route-controller-manager-7486dcfd4c-szhlc\" (UID: \"6e6bc4c7-f6a3-4e6f-82be-9a652d8e86d9\") " pod="openshift-route-controller-manager/route-controller-manager-7486dcfd4c-szhlc" Jan 23 10:54:58 crc kubenswrapper[4957]: I0123 10:54:58.638415 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6e6bc4c7-f6a3-4e6f-82be-9a652d8e86d9-client-ca\") pod \"route-controller-manager-7486dcfd4c-szhlc\" (UID: \"6e6bc4c7-f6a3-4e6f-82be-9a652d8e86d9\") " pod="openshift-route-controller-manager/route-controller-manager-7486dcfd4c-szhlc" Jan 23 10:54:58 crc kubenswrapper[4957]: I0123 10:54:58.638489 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/6e6bc4c7-f6a3-4e6f-82be-9a652d8e86d9-serving-cert\") pod \"route-controller-manager-7486dcfd4c-szhlc\" (UID: \"6e6bc4c7-f6a3-4e6f-82be-9a652d8e86d9\") " pod="openshift-route-controller-manager/route-controller-manager-7486dcfd4c-szhlc" Jan 23 10:54:58 crc kubenswrapper[4957]: I0123 10:54:58.638583 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhdc5\" (UniqueName: \"kubernetes.io/projected/6e6bc4c7-f6a3-4e6f-82be-9a652d8e86d9-kube-api-access-nhdc5\") pod \"route-controller-manager-7486dcfd4c-szhlc\" (UID: \"6e6bc4c7-f6a3-4e6f-82be-9a652d8e86d9\") " pod="openshift-route-controller-manager/route-controller-manager-7486dcfd4c-szhlc" Jan 23 10:54:58 crc kubenswrapper[4957]: I0123 10:54:58.740380 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e6bc4c7-f6a3-4e6f-82be-9a652d8e86d9-config\") pod \"route-controller-manager-7486dcfd4c-szhlc\" (UID: \"6e6bc4c7-f6a3-4e6f-82be-9a652d8e86d9\") " pod="openshift-route-controller-manager/route-controller-manager-7486dcfd4c-szhlc" Jan 23 10:54:58 crc kubenswrapper[4957]: I0123 10:54:58.740434 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6e6bc4c7-f6a3-4e6f-82be-9a652d8e86d9-client-ca\") pod \"route-controller-manager-7486dcfd4c-szhlc\" (UID: \"6e6bc4c7-f6a3-4e6f-82be-9a652d8e86d9\") " pod="openshift-route-controller-manager/route-controller-manager-7486dcfd4c-szhlc" Jan 23 10:54:58 crc kubenswrapper[4957]: I0123 10:54:58.740493 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6e6bc4c7-f6a3-4e6f-82be-9a652d8e86d9-serving-cert\") pod \"route-controller-manager-7486dcfd4c-szhlc\" (UID: \"6e6bc4c7-f6a3-4e6f-82be-9a652d8e86d9\") " pod="openshift-route-controller-manager/route-controller-manager-7486dcfd4c-szhlc" Jan 23 10:54:58 crc kubenswrapper[4957]: I0123 10:54:58.740550 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhdc5\" (UniqueName: \"kubernetes.io/projected/6e6bc4c7-f6a3-4e6f-82be-9a652d8e86d9-kube-api-access-nhdc5\") pod \"route-controller-manager-7486dcfd4c-szhlc\" (UID: \"6e6bc4c7-f6a3-4e6f-82be-9a652d8e86d9\") " pod="openshift-route-controller-manager/route-controller-manager-7486dcfd4c-szhlc" Jan 23 10:54:58 crc kubenswrapper[4957]: I0123 10:54:58.741526 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6e6bc4c7-f6a3-4e6f-82be-9a652d8e86d9-client-ca\") pod \"route-controller-manager-7486dcfd4c-szhlc\" (UID: \"6e6bc4c7-f6a3-4e6f-82be-9a652d8e86d9\") " pod="openshift-route-controller-manager/route-controller-manager-7486dcfd4c-szhlc" Jan 23 10:54:58 crc kubenswrapper[4957]: I0123 10:54:58.741605 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e6bc4c7-f6a3-4e6f-82be-9a652d8e86d9-config\") pod \"route-controller-manager-7486dcfd4c-szhlc\" (UID: \"6e6bc4c7-f6a3-4e6f-82be-9a652d8e86d9\") " pod="openshift-route-controller-manager/route-controller-manager-7486dcfd4c-szhlc" Jan 23 10:54:58 crc kubenswrapper[4957]: I0123 10:54:58.747935 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6e6bc4c7-f6a3-4e6f-82be-9a652d8e86d9-serving-cert\") 
pod \"route-controller-manager-7486dcfd4c-szhlc\" (UID: \"6e6bc4c7-f6a3-4e6f-82be-9a652d8e86d9\") " pod="openshift-route-controller-manager/route-controller-manager-7486dcfd4c-szhlc" Jan 23 10:54:58 crc kubenswrapper[4957]: I0123 10:54:58.758234 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhdc5\" (UniqueName: \"kubernetes.io/projected/6e6bc4c7-f6a3-4e6f-82be-9a652d8e86d9-kube-api-access-nhdc5\") pod \"route-controller-manager-7486dcfd4c-szhlc\" (UID: \"6e6bc4c7-f6a3-4e6f-82be-9a652d8e86d9\") " pod="openshift-route-controller-manager/route-controller-manager-7486dcfd4c-szhlc" Jan 23 10:54:58 crc kubenswrapper[4957]: I0123 10:54:58.878136 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7486dcfd4c-szhlc" Jan 23 10:54:59 crc kubenswrapper[4957]: I0123 10:54:59.328071 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7486dcfd4c-szhlc"] Jan 23 10:54:59 crc kubenswrapper[4957]: I0123 10:54:59.520502 4957 generic.go:334] "Generic (PLEG): container finished" podID="6e09bef4-a911-4771-b0e4-233eba62eddf" containerID="6fae11f0ca940c5499eadffc0ba9c700a45507908d98e2a70ca4162fa9f2de70" exitCode=0 Jan 23 10:54:59 crc kubenswrapper[4957]: I0123 10:54:59.520594 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tbq4h" event={"ID":"6e09bef4-a911-4771-b0e4-233eba62eddf","Type":"ContainerDied","Data":"6fae11f0ca940c5499eadffc0ba9c700a45507908d98e2a70ca4162fa9f2de70"} Jan 23 10:54:59 crc kubenswrapper[4957]: I0123 10:54:59.523088 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7486dcfd4c-szhlc" event={"ID":"6e6bc4c7-f6a3-4e6f-82be-9a652d8e86d9","Type":"ContainerStarted","Data":"f7770afb598658df5d1fe239be7228c6c0fbc062c5185c0eb6b9aeaaf93d1e30"} Jan 23 10:55:00 crc kubenswrapper[4957]: I0123 10:55:00.530179 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7486dcfd4c-szhlc" event={"ID":"6e6bc4c7-f6a3-4e6f-82be-9a652d8e86d9","Type":"ContainerStarted","Data":"c15537c2de03d6416a3916e1058fea58b2703f55ead63b91ae983eade45da094"} Jan 23 10:55:00 crc kubenswrapper[4957]: I0123 10:55:00.530693 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7486dcfd4c-szhlc" Jan 23 10:55:00 crc kubenswrapper[4957]: I0123 10:55:00.533557 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tbq4h" event={"ID":"6e09bef4-a911-4771-b0e4-233eba62eddf","Type":"ContainerStarted","Data":"6d22b978597596e92147648120d1fc2690abab79b2afdb5a33943cf59fdac35f"} Jan 23 10:55:00 crc kubenswrapper[4957]: I0123 10:55:00.536041 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7486dcfd4c-szhlc" Jan 23 10:55:00 crc kubenswrapper[4957]: I0123 10:55:00.548978 4957 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7486dcfd4c-szhlc" podStartSLOduration=21.548957416 podStartE2EDuration="21.548957416s" podCreationTimestamp="2026-01-23 10:54:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-01-23 10:55:00.547842136 +0000 UTC m=+210.085094833" watchObservedRunningTime="2026-01-23 10:55:00.548957416 +0000 UTC m=+210.086210103" Jan 23 10:55:00 crc kubenswrapper[4957]: I0123 10:55:00.595100 4957 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-tbq4h" podStartSLOduration=1.899582978 podStartE2EDuration="56.595076139s" podCreationTimestamp="2026-01-23 10:54:04 +0000 UTC" firstStartedPulling="2026-01-23 10:54:05.354779847 +0000 UTC m=+154.892032534" lastFinishedPulling="2026-01-23 10:55:00.050272988 +0000 UTC m=+209.587525695" observedRunningTime="2026-01-23 10:55:00.591144324 +0000 UTC m=+210.128397011" watchObservedRunningTime="2026-01-23 10:55:00.595076139 +0000 UTC m=+210.132328826" Jan 23 10:55:04 crc kubenswrapper[4957]: I0123 10:55:04.533047 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-tbq4h" Jan 23 10:55:04 crc kubenswrapper[4957]: I0123 10:55:04.534440 4957 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-tbq4h" Jan 23 10:55:05 crc kubenswrapper[4957]: I0123 10:55:05.029917 4957 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-tbq4h" Jan 23 10:55:05 crc kubenswrapper[4957]: I0123 10:55:05.563334 4957 generic.go:334] "Generic (PLEG): container finished" podID="784bb395-54fe-47ab-9fd3-0298329d8566" containerID="b4cf07f4860c65c89b13b7203a10a0a9fe192e40bbebb28b8ad7512ae527aafd" exitCode=0 Jan 23 10:55:05 crc kubenswrapper[4957]: I0123 10:55:05.563431 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5jhgc" event={"ID":"784bb395-54fe-47ab-9fd3-0298329d8566","Type":"ContainerDied","Data":"b4cf07f4860c65c89b13b7203a10a0a9fe192e40bbebb28b8ad7512ae527aafd"} Jan 23 10:55:05 crc kubenswrapper[4957]: I0123 10:55:05.613368 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-tbq4h" Jan 23 10:55:06 crc kubenswrapper[4957]: I0123 10:55:06.408102 4957 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tbq4h"] Jan 23 10:55:06 crc kubenswrapper[4957]: I0123 10:55:06.571007 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l54d2" event={"ID":"e16b20f4-5d6a-4cf6-878a-3cf03afe72bb","Type":"ContainerStarted","Data":"3184b37dc088a47df77688cebb1834390b0aa81bdcc42c13706e7ccddfcbdaf4"} Jan 23 10:55:06 crc kubenswrapper[4957]: I0123 10:55:06.573948 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5jhgc" event={"ID":"784bb395-54fe-47ab-9fd3-0298329d8566","Type":"ContainerStarted","Data":"5c563c47d91953b035693d5af537c298d6b7f9e067659d1171e81c81c27b91bf"} Jan 23 10:55:06 crc kubenswrapper[4957]: I0123 10:55:06.607346 4957 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5jhgc" podStartSLOduration=2.076929343 podStartE2EDuration="1m0.60732074s" podCreationTimestamp="2026-01-23 10:54:06 +0000 UTC" firstStartedPulling="2026-01-23 10:54:07.433652188 +0000 UTC m=+156.970904875" lastFinishedPulling="2026-01-23 10:55:05.964043585 +0000 UTC m=+215.501296272" observedRunningTime="2026-01-23 10:55:06.606782255 +0000 UTC m=+216.144034942" watchObservedRunningTime="2026-01-23 10:55:06.60732074 
+0000 UTC m=+216.144573427" Jan 23 10:55:06 crc kubenswrapper[4957]: I0123 10:55:06.722034 4957 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-5jhgc" Jan 23 10:55:06 crc kubenswrapper[4957]: I0123 10:55:06.722075 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-5jhgc" Jan 23 10:55:07 crc kubenswrapper[4957]: I0123 10:55:07.579790 4957 generic.go:334] "Generic (PLEG): container finished" podID="e16b20f4-5d6a-4cf6-878a-3cf03afe72bb" containerID="3184b37dc088a47df77688cebb1834390b0aa81bdcc42c13706e7ccddfcbdaf4" exitCode=0 Jan 23 10:55:07 crc kubenswrapper[4957]: I0123 10:55:07.579995 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l54d2" event={"ID":"e16b20f4-5d6a-4cf6-878a-3cf03afe72bb","Type":"ContainerDied","Data":"3184b37dc088a47df77688cebb1834390b0aa81bdcc42c13706e7ccddfcbdaf4"} Jan 23 10:55:07 crc kubenswrapper[4957]: I0123 10:55:07.580177 4957 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-tbq4h" podUID="6e09bef4-a911-4771-b0e4-233eba62eddf" containerName="registry-server" containerID="cri-o://6d22b978597596e92147648120d1fc2690abab79b2afdb5a33943cf59fdac35f" gracePeriod=2 Jan 23 10:55:07 crc kubenswrapper[4957]: I0123 10:55:07.764775 4957 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-5jhgc" podUID="784bb395-54fe-47ab-9fd3-0298329d8566" containerName="registry-server" probeResult="failure" output=< Jan 23 10:55:07 crc kubenswrapper[4957]: timeout: failed to connect service ":50051" within 1s Jan 23 10:55:07 crc kubenswrapper[4957]: > Jan 23 10:55:08 crc kubenswrapper[4957]: I0123 10:55:08.118027 4957 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tbq4h" Jan 23 10:55:08 crc kubenswrapper[4957]: I0123 10:55:08.163354 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e09bef4-a911-4771-b0e4-233eba62eddf-utilities\") pod \"6e09bef4-a911-4771-b0e4-233eba62eddf\" (UID: \"6e09bef4-a911-4771-b0e4-233eba62eddf\") " Jan 23 10:55:08 crc kubenswrapper[4957]: I0123 10:55:08.163402 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hsspd\" (UniqueName: \"kubernetes.io/projected/6e09bef4-a911-4771-b0e4-233eba62eddf-kube-api-access-hsspd\") pod \"6e09bef4-a911-4771-b0e4-233eba62eddf\" (UID: \"6e09bef4-a911-4771-b0e4-233eba62eddf\") " Jan 23 10:55:08 crc kubenswrapper[4957]: I0123 10:55:08.163442 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e09bef4-a911-4771-b0e4-233eba62eddf-catalog-content\") pod \"6e09bef4-a911-4771-b0e4-233eba62eddf\" (UID: \"6e09bef4-a911-4771-b0e4-233eba62eddf\") " Jan 23 10:55:08 crc kubenswrapper[4957]: I0123 10:55:08.165951 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e09bef4-a911-4771-b0e4-233eba62eddf-utilities" (OuterVolumeSpecName: "utilities") pod "6e09bef4-a911-4771-b0e4-233eba62eddf" (UID: "6e09bef4-a911-4771-b0e4-233eba62eddf"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 23 10:55:08 crc kubenswrapper[4957]: I0123 10:55:08.170348 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e09bef4-a911-4771-b0e4-233eba62eddf-kube-api-access-hsspd" (OuterVolumeSpecName: "kube-api-access-hsspd") pod "6e09bef4-a911-4771-b0e4-233eba62eddf" (UID: "6e09bef4-a911-4771-b0e4-233eba62eddf"). InnerVolumeSpecName "kube-api-access-hsspd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 23 10:55:08 crc kubenswrapper[4957]: I0123 10:55:08.222077 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e09bef4-a911-4771-b0e4-233eba62eddf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6e09bef4-a911-4771-b0e4-233eba62eddf" (UID: "6e09bef4-a911-4771-b0e4-233eba62eddf"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 23 10:55:08 crc kubenswrapper[4957]: I0123 10:55:08.265326 4957 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e09bef4-a911-4771-b0e4-233eba62eddf-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 23 10:55:08 crc kubenswrapper[4957]: I0123 10:55:08.265366 4957 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e09bef4-a911-4771-b0e4-233eba62eddf-utilities\") on node \"crc\" DevicePath \"\"" Jan 23 10:55:08 crc kubenswrapper[4957]: I0123 10:55:08.265378 4957 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hsspd\" (UniqueName: \"kubernetes.io/projected/6e09bef4-a911-4771-b0e4-233eba62eddf-kube-api-access-hsspd\") on node \"crc\" DevicePath \"\"" Jan 23 10:55:08 crc kubenswrapper[4957]: I0123 10:55:08.588559 4957 generic.go:334] "Generic (PLEG): container finished" podID="6e09bef4-a911-4771-b0e4-233eba62eddf" containerID="6d22b978597596e92147648120d1fc2690abab79b2afdb5a33943cf59fdac35f" exitCode=0 Jan 23 10:55:08 crc kubenswrapper[4957]: I0123 10:55:08.588605 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tbq4h" event={"ID":"6e09bef4-a911-4771-b0e4-233eba62eddf","Type":"ContainerDied","Data":"6d22b978597596e92147648120d1fc2690abab79b2afdb5a33943cf59fdac35f"} Jan 23 10:55:08 crc kubenswrapper[4957]: I0123 10:55:08.588933 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tbq4h" event={"ID":"6e09bef4-a911-4771-b0e4-233eba62eddf","Type":"ContainerDied","Data":"9c749650d5ed6a26e9eb1613757b6b2dae74cde7e52ef3aabf13e0a278c0ef47"} Jan 23 10:55:08 crc kubenswrapper[4957]: I0123 10:55:08.588958 4957 scope.go:117] "RemoveContainer" containerID="6d22b978597596e92147648120d1fc2690abab79b2afdb5a33943cf59fdac35f" Jan 23 10:55:08 crc kubenswrapper[4957]: I0123 10:55:08.588647 4957 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tbq4h" Jan 23 10:55:08 crc kubenswrapper[4957]: I0123 10:55:08.591159 4957 generic.go:334] "Generic (PLEG): container finished" podID="24be1c19-2466-4e1e-9232-9697534b5d6e" containerID="2b31fe528d36092a70294c93e7130464c754e6fb09964153ca51da5bb9c30679" exitCode=0 Jan 23 10:55:08 crc kubenswrapper[4957]: I0123 10:55:08.591226 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-87khr" event={"ID":"24be1c19-2466-4e1e-9232-9697534b5d6e","Type":"ContainerDied","Data":"2b31fe528d36092a70294c93e7130464c754e6fb09964153ca51da5bb9c30679"} Jan 23 10:55:08 crc kubenswrapper[4957]: I0123 10:55:08.597515 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l54d2" event={"ID":"e16b20f4-5d6a-4cf6-878a-3cf03afe72bb","Type":"ContainerStarted","Data":"469d05b466a9a6a0f444264b7b6dfb1fd254375482721d913849b87999f43a7b"} Jan 23 10:55:08 crc kubenswrapper[4957]: I0123 10:55:08.605406 4957 scope.go:117] "RemoveContainer" containerID="6fae11f0ca940c5499eadffc0ba9c700a45507908d98e2a70ca4162fa9f2de70" Jan 23 10:55:08 crc kubenswrapper[4957]: I0123 10:55:08.607106 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5zrcr" event={"ID":"fd5862ba-aabd-4c6f-b546-3f22d40592b8","Type":"ContainerStarted","Data":"31f7c645a56bc8605242d18d669194cd860c8507230e2b8c7992753c5479b4ab"} Jan 23 10:55:08 crc kubenswrapper[4957]: I0123 10:55:08.635988 4957 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-l54d2" podStartSLOduration=2.849464004 podStartE2EDuration="1m2.635967148s" podCreationTimestamp="2026-01-23 10:54:06 +0000 UTC" firstStartedPulling="2026-01-23 10:54:08.48852436 +0000 UTC m=+158.025777047" lastFinishedPulling="2026-01-23 10:55:08.275027504 +0000 UTC m=+217.812280191" observedRunningTime="2026-01-23 10:55:08.63382936 +0000 UTC m=+218.171082087" watchObservedRunningTime="2026-01-23 10:55:08.635967148 +0000 UTC m=+218.173219835" Jan 23 10:55:08 crc kubenswrapper[4957]: I0123 10:55:08.636565 4957 scope.go:117] "RemoveContainer" containerID="d0929dde179d094b8eb228493c153348a41cbdfe85fac142803941e441b0e891" Jan 23 10:55:08 crc kubenswrapper[4957]: I0123 10:55:08.662859 4957 scope.go:117] "RemoveContainer" containerID="6d22b978597596e92147648120d1fc2690abab79b2afdb5a33943cf59fdac35f" Jan 23 10:55:08 crc kubenswrapper[4957]: E0123 10:55:08.665131 4957 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d22b978597596e92147648120d1fc2690abab79b2afdb5a33943cf59fdac35f\": container with ID starting with 6d22b978597596e92147648120d1fc2690abab79b2afdb5a33943cf59fdac35f not found: ID does not exist" containerID="6d22b978597596e92147648120d1fc2690abab79b2afdb5a33943cf59fdac35f" Jan 23 10:55:08 crc kubenswrapper[4957]: I0123 10:55:08.665168 4957 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d22b978597596e92147648120d1fc2690abab79b2afdb5a33943cf59fdac35f"} err="failed to get container status \"6d22b978597596e92147648120d1fc2690abab79b2afdb5a33943cf59fdac35f\": rpc error: code = NotFound desc = could not find container \"6d22b978597596e92147648120d1fc2690abab79b2afdb5a33943cf59fdac35f\": container with ID starting with 6d22b978597596e92147648120d1fc2690abab79b2afdb5a33943cf59fdac35f not found: ID does not exist" Jan 23 10:55:08 crc 
kubenswrapper[4957]: I0123 10:55:08.665198 4957 scope.go:117] "RemoveContainer" containerID="6fae11f0ca940c5499eadffc0ba9c700a45507908d98e2a70ca4162fa9f2de70" Jan 23 10:55:08 crc kubenswrapper[4957]: E0123 10:55:08.666318 4957 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6fae11f0ca940c5499eadffc0ba9c700a45507908d98e2a70ca4162fa9f2de70\": container with ID starting with 6fae11f0ca940c5499eadffc0ba9c700a45507908d98e2a70ca4162fa9f2de70 not found: ID does not exist" containerID="6fae11f0ca940c5499eadffc0ba9c700a45507908d98e2a70ca4162fa9f2de70" Jan 23 10:55:08 crc kubenswrapper[4957]: I0123 10:55:08.666347 4957 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tbq4h"] Jan 23 10:55:08 crc kubenswrapper[4957]: I0123 10:55:08.666351 4957 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6fae11f0ca940c5499eadffc0ba9c700a45507908d98e2a70ca4162fa9f2de70"} err="failed to get container status \"6fae11f0ca940c5499eadffc0ba9c700a45507908d98e2a70ca4162fa9f2de70\": rpc error: code = NotFound desc = could not find container \"6fae11f0ca940c5499eadffc0ba9c700a45507908d98e2a70ca4162fa9f2de70\": container with ID starting with 6fae11f0ca940c5499eadffc0ba9c700a45507908d98e2a70ca4162fa9f2de70 not found: ID does not exist" Jan 23 10:55:08 crc kubenswrapper[4957]: I0123 10:55:08.666370 4957 scope.go:117] "RemoveContainer" containerID="d0929dde179d094b8eb228493c153348a41cbdfe85fac142803941e441b0e891" Jan 23 10:55:08 crc kubenswrapper[4957]: E0123 10:55:08.666715 4957 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d0929dde179d094b8eb228493c153348a41cbdfe85fac142803941e441b0e891\": container with ID starting with d0929dde179d094b8eb228493c153348a41cbdfe85fac142803941e441b0e891 not found: ID does not exist" containerID="d0929dde179d094b8eb228493c153348a41cbdfe85fac142803941e441b0e891" Jan 23 10:55:08 crc kubenswrapper[4957]: I0123 10:55:08.666745 4957 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0929dde179d094b8eb228493c153348a41cbdfe85fac142803941e441b0e891"} err="failed to get container status \"d0929dde179d094b8eb228493c153348a41cbdfe85fac142803941e441b0e891\": rpc error: code = NotFound desc = could not find container \"d0929dde179d094b8eb228493c153348a41cbdfe85fac142803941e441b0e891\": container with ID starting with d0929dde179d094b8eb228493c153348a41cbdfe85fac142803941e441b0e891 not found: ID does not exist" Jan 23 10:55:08 crc kubenswrapper[4957]: I0123 10:55:08.674427 4957 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-tbq4h"] Jan 23 10:55:08 crc kubenswrapper[4957]: I0123 10:55:08.778887 4957 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e09bef4-a911-4771-b0e4-233eba62eddf" path="/var/lib/kubelet/pods/6e09bef4-a911-4771-b0e4-233eba62eddf/volumes" Jan 23 10:55:09 crc kubenswrapper[4957]: I0123 10:55:09.614633 4957 generic.go:334] "Generic (PLEG): container finished" podID="fd5862ba-aabd-4c6f-b546-3f22d40592b8" containerID="31f7c645a56bc8605242d18d669194cd860c8507230e2b8c7992753c5479b4ab" exitCode=0 Jan 23 10:55:09 crc kubenswrapper[4957]: I0123 10:55:09.614719 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5zrcr" 
event={"ID":"fd5862ba-aabd-4c6f-b546-3f22d40592b8","Type":"ContainerDied","Data":"31f7c645a56bc8605242d18d669194cd860c8507230e2b8c7992753c5479b4ab"} Jan 23 10:55:09 crc kubenswrapper[4957]: I0123 10:55:09.619457 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-87khr" event={"ID":"24be1c19-2466-4e1e-9232-9697534b5d6e","Type":"ContainerStarted","Data":"d6752cb476d759d545ed0898d08ffad32d316c92e10f1bdd0feed559b300bbb3"} Jan 23 10:55:09 crc kubenswrapper[4957]: I0123 10:55:09.654920 4957 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-87khr" podStartSLOduration=2.058299787 podStartE2EDuration="1m5.654900369s" podCreationTimestamp="2026-01-23 10:54:04 +0000 UTC" firstStartedPulling="2026-01-23 10:54:05.390514827 +0000 UTC m=+154.927767514" lastFinishedPulling="2026-01-23 10:55:08.987115409 +0000 UTC m=+218.524368096" observedRunningTime="2026-01-23 10:55:09.651912299 +0000 UTC m=+219.189164986" watchObservedRunningTime="2026-01-23 10:55:09.654900369 +0000 UTC m=+219.192153056" Jan 23 10:55:10 crc kubenswrapper[4957]: I0123 10:55:10.629087 4957 generic.go:334] "Generic (PLEG): container finished" podID="783873e0-bce9-4f05-849d-0fe265010d39" containerID="52369e67449d0e451f9f3a2943e0cce3d516df5029f90a17c23841d4d9715b85" exitCode=0 Jan 23 10:55:10 crc kubenswrapper[4957]: I0123 10:55:10.629173 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2c452" event={"ID":"783873e0-bce9-4f05-849d-0fe265010d39","Type":"ContainerDied","Data":"52369e67449d0e451f9f3a2943e0cce3d516df5029f90a17c23841d4d9715b85"} Jan 23 10:55:11 crc kubenswrapper[4957]: I0123 10:55:11.640314 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5zrcr" event={"ID":"fd5862ba-aabd-4c6f-b546-3f22d40592b8","Type":"ContainerStarted","Data":"e73dbca9c5d8267119099f1d2c0c6a216a518492d27f869717dfb27f094dfe6b"} Jan 23 10:55:11 crc kubenswrapper[4957]: I0123 10:55:11.661245 4957 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-5zrcr" podStartSLOduration=3.7539809809999998 podStartE2EDuration="1m8.66122624s" podCreationTimestamp="2026-01-23 10:54:03 +0000 UTC" firstStartedPulling="2026-01-23 10:54:05.395954349 +0000 UTC m=+154.933207036" lastFinishedPulling="2026-01-23 10:55:10.303199608 +0000 UTC m=+219.840452295" observedRunningTime="2026-01-23 10:55:11.658391984 +0000 UTC m=+221.195644671" watchObservedRunningTime="2026-01-23 10:55:11.66122624 +0000 UTC m=+221.198478927" Jan 23 10:55:12 crc kubenswrapper[4957]: I0123 10:55:12.647043 4957 generic.go:334] "Generic (PLEG): container finished" podID="c0641b36-685a-4625-93cb-a6159de3628e" containerID="fa9763f0015aaf67c85a8ac482befff393d16f1c4b9e459968fb6bd2d042ce27" exitCode=0 Jan 23 10:55:12 crc kubenswrapper[4957]: I0123 10:55:12.647120 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f5qpn" event={"ID":"c0641b36-685a-4625-93cb-a6159de3628e","Type":"ContainerDied","Data":"fa9763f0015aaf67c85a8ac482befff393d16f1c4b9e459968fb6bd2d042ce27"} Jan 23 10:55:12 crc kubenswrapper[4957]: I0123 10:55:12.650729 4957 generic.go:334] "Generic (PLEG): container finished" podID="4210122d-91b9-4890-a8d0-23e71c42d121" containerID="cee261db6d544f87c37e4be56a806fcdc0605adb7511695e50fef22ad660c6af" exitCode=0 Jan 23 10:55:12 crc kubenswrapper[4957]: I0123 10:55:12.650764 4957 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l7xvl" event={"ID":"4210122d-91b9-4890-a8d0-23e71c42d121","Type":"ContainerDied","Data":"cee261db6d544f87c37e4be56a806fcdc0605adb7511695e50fef22ad660c6af"} Jan 23 10:55:14 crc kubenswrapper[4957]: I0123 10:55:14.083300 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-5zrcr" Jan 23 10:55:14 crc kubenswrapper[4957]: I0123 10:55:14.083693 4957 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-5zrcr" Jan 23 10:55:14 crc kubenswrapper[4957]: I0123 10:55:14.133685 4957 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-5zrcr" Jan 23 10:55:14 crc kubenswrapper[4957]: I0123 10:55:14.695879 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-87khr" Jan 23 10:55:14 crc kubenswrapper[4957]: I0123 10:55:14.696264 4957 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-87khr" Jan 23 10:55:14 crc kubenswrapper[4957]: I0123 10:55:14.731324 4957 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-87khr" Jan 23 10:55:15 crc kubenswrapper[4957]: I0123 10:55:15.046022 4957 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-p5lm9"] Jan 23 10:55:15 crc kubenswrapper[4957]: E0123 10:55:15.046317 4957 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e09bef4-a911-4771-b0e4-233eba62eddf" containerName="registry-server" Jan 23 10:55:15 crc kubenswrapper[4957]: I0123 10:55:15.046335 4957 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e09bef4-a911-4771-b0e4-233eba62eddf" containerName="registry-server" Jan 23 10:55:15 crc kubenswrapper[4957]: E0123 10:55:15.046350 4957 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e09bef4-a911-4771-b0e4-233eba62eddf" containerName="extract-content" Jan 23 10:55:15 crc kubenswrapper[4957]: I0123 10:55:15.046358 4957 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e09bef4-a911-4771-b0e4-233eba62eddf" containerName="extract-content" Jan 23 10:55:15 crc kubenswrapper[4957]: E0123 10:55:15.046379 4957 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e09bef4-a911-4771-b0e4-233eba62eddf" containerName="extract-utilities" Jan 23 10:55:15 crc kubenswrapper[4957]: I0123 10:55:15.046387 4957 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e09bef4-a911-4771-b0e4-233eba62eddf" containerName="extract-utilities" Jan 23 10:55:15 crc kubenswrapper[4957]: I0123 10:55:15.046531 4957 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e09bef4-a911-4771-b0e4-233eba62eddf" containerName="registry-server" Jan 23 10:55:15 crc kubenswrapper[4957]: I0123 10:55:15.046976 4957 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-p5lm9" Jan 23 10:55:15 crc kubenswrapper[4957]: I0123 10:55:15.056532 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-p5lm9"] Jan 23 10:55:15 crc kubenswrapper[4957]: I0123 10:55:15.156935 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4856b826-759f-4132-bfc5-d80385771e22-trusted-ca\") pod \"image-registry-66df7c8f76-p5lm9\" (UID: \"4856b826-759f-4132-bfc5-d80385771e22\") " pod="openshift-image-registry/image-registry-66df7c8f76-p5lm9" Jan 23 10:55:15 crc kubenswrapper[4957]: I0123 10:55:15.157055 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-p5lm9\" (UID: \"4856b826-759f-4132-bfc5-d80385771e22\") " pod="openshift-image-registry/image-registry-66df7c8f76-p5lm9" Jan 23 10:55:15 crc kubenswrapper[4957]: I0123 10:55:15.157089 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4856b826-759f-4132-bfc5-d80385771e22-registry-tls\") pod \"image-registry-66df7c8f76-p5lm9\" (UID: \"4856b826-759f-4132-bfc5-d80385771e22\") " pod="openshift-image-registry/image-registry-66df7c8f76-p5lm9" Jan 23 10:55:15 crc kubenswrapper[4957]: I0123 10:55:15.157115 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4856b826-759f-4132-bfc5-d80385771e22-registry-certificates\") pod \"image-registry-66df7c8f76-p5lm9\" (UID: \"4856b826-759f-4132-bfc5-d80385771e22\") " pod="openshift-image-registry/image-registry-66df7c8f76-p5lm9" Jan 23 10:55:15 crc kubenswrapper[4957]: I0123 10:55:15.157162 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4856b826-759f-4132-bfc5-d80385771e22-installation-pull-secrets\") pod \"image-registry-66df7c8f76-p5lm9\" (UID: \"4856b826-759f-4132-bfc5-d80385771e22\") " pod="openshift-image-registry/image-registry-66df7c8f76-p5lm9" Jan 23 10:55:15 crc kubenswrapper[4957]: I0123 10:55:15.157261 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6m5n6\" (UniqueName: \"kubernetes.io/projected/4856b826-759f-4132-bfc5-d80385771e22-kube-api-access-6m5n6\") pod \"image-registry-66df7c8f76-p5lm9\" (UID: \"4856b826-759f-4132-bfc5-d80385771e22\") " pod="openshift-image-registry/image-registry-66df7c8f76-p5lm9" Jan 23 10:55:15 crc kubenswrapper[4957]: I0123 10:55:15.157315 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4856b826-759f-4132-bfc5-d80385771e22-bound-sa-token\") pod \"image-registry-66df7c8f76-p5lm9\" (UID: \"4856b826-759f-4132-bfc5-d80385771e22\") " pod="openshift-image-registry/image-registry-66df7c8f76-p5lm9" Jan 23 10:55:15 crc kubenswrapper[4957]: I0123 10:55:15.157349 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/4856b826-759f-4132-bfc5-d80385771e22-ca-trust-extracted\") pod \"image-registry-66df7c8f76-p5lm9\" (UID: \"4856b826-759f-4132-bfc5-d80385771e22\") " pod="openshift-image-registry/image-registry-66df7c8f76-p5lm9" Jan 23 10:55:15 crc kubenswrapper[4957]: I0123 10:55:15.182070 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-p5lm9\" (UID: \"4856b826-759f-4132-bfc5-d80385771e22\") " pod="openshift-image-registry/image-registry-66df7c8f76-p5lm9" Jan 23 10:55:15 crc kubenswrapper[4957]: I0123 10:55:15.258386 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6m5n6\" (UniqueName: \"kubernetes.io/projected/4856b826-759f-4132-bfc5-d80385771e22-kube-api-access-6m5n6\") pod \"image-registry-66df7c8f76-p5lm9\" (UID: \"4856b826-759f-4132-bfc5-d80385771e22\") " pod="openshift-image-registry/image-registry-66df7c8f76-p5lm9" Jan 23 10:55:15 crc kubenswrapper[4957]: I0123 10:55:15.258436 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4856b826-759f-4132-bfc5-d80385771e22-bound-sa-token\") pod \"image-registry-66df7c8f76-p5lm9\" (UID: \"4856b826-759f-4132-bfc5-d80385771e22\") " pod="openshift-image-registry/image-registry-66df7c8f76-p5lm9" Jan 23 10:55:15 crc kubenswrapper[4957]: I0123 10:55:15.258475 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4856b826-759f-4132-bfc5-d80385771e22-ca-trust-extracted\") pod \"image-registry-66df7c8f76-p5lm9\" (UID: \"4856b826-759f-4132-bfc5-d80385771e22\") " pod="openshift-image-registry/image-registry-66df7c8f76-p5lm9" Jan 23 10:55:15 crc kubenswrapper[4957]: I0123 10:55:15.258535 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4856b826-759f-4132-bfc5-d80385771e22-trusted-ca\") pod \"image-registry-66df7c8f76-p5lm9\" (UID: \"4856b826-759f-4132-bfc5-d80385771e22\") " pod="openshift-image-registry/image-registry-66df7c8f76-p5lm9" Jan 23 10:55:15 crc kubenswrapper[4957]: I0123 10:55:15.258575 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4856b826-759f-4132-bfc5-d80385771e22-registry-tls\") pod \"image-registry-66df7c8f76-p5lm9\" (UID: \"4856b826-759f-4132-bfc5-d80385771e22\") " pod="openshift-image-registry/image-registry-66df7c8f76-p5lm9" Jan 23 10:55:15 crc kubenswrapper[4957]: I0123 10:55:15.258596 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4856b826-759f-4132-bfc5-d80385771e22-registry-certificates\") pod \"image-registry-66df7c8f76-p5lm9\" (UID: \"4856b826-759f-4132-bfc5-d80385771e22\") " pod="openshift-image-registry/image-registry-66df7c8f76-p5lm9" Jan 23 10:55:15 crc kubenswrapper[4957]: I0123 10:55:15.258626 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4856b826-759f-4132-bfc5-d80385771e22-installation-pull-secrets\") pod \"image-registry-66df7c8f76-p5lm9\" (UID: \"4856b826-759f-4132-bfc5-d80385771e22\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-p5lm9" Jan 23 10:55:15 crc kubenswrapper[4957]: I0123 10:55:15.260122 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4856b826-759f-4132-bfc5-d80385771e22-registry-certificates\") pod \"image-registry-66df7c8f76-p5lm9\" (UID: \"4856b826-759f-4132-bfc5-d80385771e22\") " pod="openshift-image-registry/image-registry-66df7c8f76-p5lm9" Jan 23 10:55:15 crc kubenswrapper[4957]: I0123 10:55:15.260367 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4856b826-759f-4132-bfc5-d80385771e22-trusted-ca\") pod \"image-registry-66df7c8f76-p5lm9\" (UID: \"4856b826-759f-4132-bfc5-d80385771e22\") " pod="openshift-image-registry/image-registry-66df7c8f76-p5lm9" Jan 23 10:55:15 crc kubenswrapper[4957]: I0123 10:55:15.260637 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4856b826-759f-4132-bfc5-d80385771e22-ca-trust-extracted\") pod \"image-registry-66df7c8f76-p5lm9\" (UID: \"4856b826-759f-4132-bfc5-d80385771e22\") " pod="openshift-image-registry/image-registry-66df7c8f76-p5lm9" Jan 23 10:55:15 crc kubenswrapper[4957]: I0123 10:55:15.270159 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4856b826-759f-4132-bfc5-d80385771e22-registry-tls\") pod \"image-registry-66df7c8f76-p5lm9\" (UID: \"4856b826-759f-4132-bfc5-d80385771e22\") " pod="openshift-image-registry/image-registry-66df7c8f76-p5lm9" Jan 23 10:55:15 crc kubenswrapper[4957]: I0123 10:55:15.275578 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4856b826-759f-4132-bfc5-d80385771e22-installation-pull-secrets\") pod \"image-registry-66df7c8f76-p5lm9\" (UID: \"4856b826-759f-4132-bfc5-d80385771e22\") " pod="openshift-image-registry/image-registry-66df7c8f76-p5lm9" Jan 23 10:55:15 crc kubenswrapper[4957]: I0123 10:55:15.275604 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4856b826-759f-4132-bfc5-d80385771e22-bound-sa-token\") pod \"image-registry-66df7c8f76-p5lm9\" (UID: \"4856b826-759f-4132-bfc5-d80385771e22\") " pod="openshift-image-registry/image-registry-66df7c8f76-p5lm9" Jan 23 10:55:15 crc kubenswrapper[4957]: I0123 10:55:15.276807 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6m5n6\" (UniqueName: \"kubernetes.io/projected/4856b826-759f-4132-bfc5-d80385771e22-kube-api-access-6m5n6\") pod \"image-registry-66df7c8f76-p5lm9\" (UID: \"4856b826-759f-4132-bfc5-d80385771e22\") " pod="openshift-image-registry/image-registry-66df7c8f76-p5lm9" Jan 23 10:55:15 crc kubenswrapper[4957]: I0123 10:55:15.366247 4957 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-p5lm9" Jan 23 10:55:15 crc kubenswrapper[4957]: I0123 10:55:15.713263 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-87khr" Jan 23 10:55:15 crc kubenswrapper[4957]: I0123 10:55:15.716680 4957 patch_prober.go:28] interesting pod/machine-config-daemon-w2xjv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 23 10:55:15 crc kubenswrapper[4957]: I0123 10:55:15.716735 4957 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w2xjv" podUID="224e3211-1f68-4673-8975-7e71b1e513d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 23 10:55:15 crc kubenswrapper[4957]: I0123 10:55:15.716780 4957 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w2xjv" Jan 23 10:55:15 crc kubenswrapper[4957]: I0123 10:55:15.717547 4957 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f355e8990ff693448c7b8df392b7b2caeb59d6fee6cf8d5d4200f8ce1b5e03ae"} pod="openshift-machine-config-operator/machine-config-daemon-w2xjv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 23 10:55:15 crc kubenswrapper[4957]: I0123 10:55:15.717618 4957 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w2xjv" podUID="224e3211-1f68-4673-8975-7e71b1e513d0" containerName="machine-config-daemon" containerID="cri-o://f355e8990ff693448c7b8df392b7b2caeb59d6fee6cf8d5d4200f8ce1b5e03ae" gracePeriod=600 Jan 23 10:55:15 crc kubenswrapper[4957]: I0123 10:55:15.751177 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-p5lm9"] Jan 23 10:55:15 crc kubenswrapper[4957]: W0123 10:55:15.756720 4957 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4856b826_759f_4132_bfc5_d80385771e22.slice/crio-640d80c088d63ecbb95aa3adbde50c9ba0e5e168d53cc3e0f603081048b3562c WatchSource:0}: Error finding container 640d80c088d63ecbb95aa3adbde50c9ba0e5e168d53cc3e0f603081048b3562c: Status 404 returned error can't find the container with id 640d80c088d63ecbb95aa3adbde50c9ba0e5e168d53cc3e0f603081048b3562c Jan 23 10:55:16 crc kubenswrapper[4957]: I0123 10:55:16.673267 4957 generic.go:334] "Generic (PLEG): container finished" podID="224e3211-1f68-4673-8975-7e71b1e513d0" containerID="f355e8990ff693448c7b8df392b7b2caeb59d6fee6cf8d5d4200f8ce1b5e03ae" exitCode=0 Jan 23 10:55:16 crc kubenswrapper[4957]: I0123 10:55:16.673320 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2xjv" event={"ID":"224e3211-1f68-4673-8975-7e71b1e513d0","Type":"ContainerDied","Data":"f355e8990ff693448c7b8df392b7b2caeb59d6fee6cf8d5d4200f8ce1b5e03ae"} Jan 23 10:55:16 crc kubenswrapper[4957]: I0123 10:55:16.675791 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-p5lm9" 
event={"ID":"4856b826-759f-4132-bfc5-d80385771e22","Type":"ContainerStarted","Data":"9de1e46de5ecb03a1daa37dc6977c660a1131c9196618ff940a118ee40a0e4ff"} Jan 23 10:55:16 crc kubenswrapper[4957]: I0123 10:55:16.675835 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-p5lm9" event={"ID":"4856b826-759f-4132-bfc5-d80385771e22","Type":"ContainerStarted","Data":"640d80c088d63ecbb95aa3adbde50c9ba0e5e168d53cc3e0f603081048b3562c"} Jan 23 10:55:16 crc kubenswrapper[4957]: I0123 10:55:16.679182 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2c452" event={"ID":"783873e0-bce9-4f05-849d-0fe265010d39","Type":"ContainerStarted","Data":"af3967c7d4dec06fcfcebb91c59852179bacb2c68717d53b842c8fbe1af2ff7f"} Jan 23 10:55:16 crc kubenswrapper[4957]: I0123 10:55:16.764023 4957 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-5jhgc" Jan 23 10:55:16 crc kubenswrapper[4957]: I0123 10:55:16.783013 4957 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-2c452" podStartSLOduration=5.194208232 podStartE2EDuration="1m11.782994476s" podCreationTimestamp="2026-01-23 10:54:05 +0000 UTC" firstStartedPulling="2026-01-23 10:54:07.447320104 +0000 UTC m=+156.984572791" lastFinishedPulling="2026-01-23 10:55:14.036106338 +0000 UTC m=+223.573359035" observedRunningTime="2026-01-23 10:55:16.703521911 +0000 UTC m=+226.240774598" watchObservedRunningTime="2026-01-23 10:55:16.782994476 +0000 UTC m=+226.320247173" Jan 23 10:55:16 crc kubenswrapper[4957]: I0123 10:55:16.806705 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-5jhgc" Jan 23 10:55:17 crc kubenswrapper[4957]: I0123 10:55:17.292230 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-l54d2" Jan 23 10:55:17 crc kubenswrapper[4957]: I0123 10:55:17.292298 4957 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-l54d2" Jan 23 10:55:17 crc kubenswrapper[4957]: I0123 10:55:17.335218 4957 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-l54d2" Jan 23 10:55:17 crc kubenswrapper[4957]: I0123 10:55:17.684861 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-p5lm9" Jan 23 10:55:17 crc kubenswrapper[4957]: I0123 10:55:17.734684 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-l54d2" Jan 23 10:55:17 crc kubenswrapper[4957]: I0123 10:55:17.749560 4957 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-p5lm9" podStartSLOduration=2.749539838 podStartE2EDuration="2.749539838s" podCreationTimestamp="2026-01-23 10:55:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-23 10:55:17.707528494 +0000 UTC m=+227.244781181" watchObservedRunningTime="2026-01-23 10:55:17.749539838 +0000 UTC m=+227.286792525" Jan 23 10:55:18 crc kubenswrapper[4957]: I0123 10:55:18.806562 4957 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-87khr"] Jan 23 10:55:18 crc 
kubenswrapper[4957]: I0123 10:55:18.807195 4957 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-87khr" podUID="24be1c19-2466-4e1e-9232-9697534b5d6e" containerName="registry-server" containerID="cri-o://d6752cb476d759d545ed0898d08ffad32d316c92e10f1bdd0feed559b300bbb3" gracePeriod=2 Jan 23 10:55:19 crc kubenswrapper[4957]: I0123 10:55:19.438833 4957 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-54ddc8bc77-xr4fw"] Jan 23 10:55:19 crc kubenswrapper[4957]: I0123 10:55:19.439079 4957 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-54ddc8bc77-xr4fw" podUID="5ff23ec1-7d08-4d1b-bdd5-6e97a2662479" containerName="controller-manager" containerID="cri-o://006a101a7bfcb87a430da458e87a04fdf85da360b22196ba744994e050bb9fdb" gracePeriod=30 Jan 23 10:55:19 crc kubenswrapper[4957]: I0123 10:55:19.537511 4957 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7486dcfd4c-szhlc"] Jan 23 10:55:19 crc kubenswrapper[4957]: I0123 10:55:19.537745 4957 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7486dcfd4c-szhlc" podUID="6e6bc4c7-f6a3-4e6f-82be-9a652d8e86d9" containerName="route-controller-manager" containerID="cri-o://c15537c2de03d6416a3916e1058fea58b2703f55ead63b91ae983eade45da094" gracePeriod=30 Jan 23 10:55:19 crc kubenswrapper[4957]: I0123 10:55:19.803539 4957 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5jhgc"] Jan 23 10:55:19 crc kubenswrapper[4957]: I0123 10:55:19.803795 4957 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-5jhgc" podUID="784bb395-54fe-47ab-9fd3-0298329d8566" containerName="registry-server" containerID="cri-o://5c563c47d91953b035693d5af537c298d6b7f9e067659d1171e81c81c27b91bf" gracePeriod=2 Jan 23 10:55:20 crc kubenswrapper[4957]: I0123 10:55:20.699798 4957 generic.go:334] "Generic (PLEG): container finished" podID="5ff23ec1-7d08-4d1b-bdd5-6e97a2662479" containerID="006a101a7bfcb87a430da458e87a04fdf85da360b22196ba744994e050bb9fdb" exitCode=0 Jan 23 10:55:20 crc kubenswrapper[4957]: I0123 10:55:20.699891 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-54ddc8bc77-xr4fw" event={"ID":"5ff23ec1-7d08-4d1b-bdd5-6e97a2662479","Type":"ContainerDied","Data":"006a101a7bfcb87a430da458e87a04fdf85da360b22196ba744994e050bb9fdb"} Jan 23 10:55:20 crc kubenswrapper[4957]: I0123 10:55:20.701773 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2xjv" event={"ID":"224e3211-1f68-4673-8975-7e71b1e513d0","Type":"ContainerStarted","Data":"380c1560a9b50de2450e83ba786578aa9f30c79645bd086692c358cef7dcbaf6"} Jan 23 10:55:20 crc kubenswrapper[4957]: I0123 10:55:20.705114 4957 generic.go:334] "Generic (PLEG): container finished" podID="784bb395-54fe-47ab-9fd3-0298329d8566" containerID="5c563c47d91953b035693d5af537c298d6b7f9e067659d1171e81c81c27b91bf" exitCode=0 Jan 23 10:55:20 crc kubenswrapper[4957]: I0123 10:55:20.705169 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5jhgc" 
event={"ID":"784bb395-54fe-47ab-9fd3-0298329d8566","Type":"ContainerDied","Data":"5c563c47d91953b035693d5af537c298d6b7f9e067659d1171e81c81c27b91bf"} Jan 23 10:55:20 crc kubenswrapper[4957]: I0123 10:55:20.712908 4957 generic.go:334] "Generic (PLEG): container finished" podID="6e6bc4c7-f6a3-4e6f-82be-9a652d8e86d9" containerID="c15537c2de03d6416a3916e1058fea58b2703f55ead63b91ae983eade45da094" exitCode=0 Jan 23 10:55:20 crc kubenswrapper[4957]: I0123 10:55:20.712963 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7486dcfd4c-szhlc" event={"ID":"6e6bc4c7-f6a3-4e6f-82be-9a652d8e86d9","Type":"ContainerDied","Data":"c15537c2de03d6416a3916e1058fea58b2703f55ead63b91ae983eade45da094"} Jan 23 10:55:21 crc kubenswrapper[4957]: I0123 10:55:21.723141 4957 generic.go:334] "Generic (PLEG): container finished" podID="24be1c19-2466-4e1e-9232-9697534b5d6e" containerID="d6752cb476d759d545ed0898d08ffad32d316c92e10f1bdd0feed559b300bbb3" exitCode=0 Jan 23 10:55:21 crc kubenswrapper[4957]: I0123 10:55:21.724243 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-87khr" event={"ID":"24be1c19-2466-4e1e-9232-9697534b5d6e","Type":"ContainerDied","Data":"d6752cb476d759d545ed0898d08ffad32d316c92e10f1bdd0feed559b300bbb3"} Jan 23 10:55:22 crc kubenswrapper[4957]: I0123 10:55:22.267307 4957 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7486dcfd4c-szhlc" Jan 23 10:55:22 crc kubenswrapper[4957]: I0123 10:55:22.293876 4957 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-c54fd9bc-zh7lp"] Jan 23 10:55:22 crc kubenswrapper[4957]: E0123 10:55:22.294171 4957 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e6bc4c7-f6a3-4e6f-82be-9a652d8e86d9" containerName="route-controller-manager" Jan 23 10:55:22 crc kubenswrapper[4957]: I0123 10:55:22.294188 4957 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e6bc4c7-f6a3-4e6f-82be-9a652d8e86d9" containerName="route-controller-manager" Jan 23 10:55:22 crc kubenswrapper[4957]: I0123 10:55:22.304979 4957 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e6bc4c7-f6a3-4e6f-82be-9a652d8e86d9" containerName="route-controller-manager" Jan 23 10:55:22 crc kubenswrapper[4957]: I0123 10:55:22.305606 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-c54fd9bc-zh7lp" Jan 23 10:55:22 crc kubenswrapper[4957]: I0123 10:55:22.305866 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-c54fd9bc-zh7lp"] Jan 23 10:55:22 crc kubenswrapper[4957]: I0123 10:55:22.364241 4957 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5jhgc" Jan 23 10:55:22 crc kubenswrapper[4957]: I0123 10:55:22.453783 4957 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-87khr" Jan 23 10:55:22 crc kubenswrapper[4957]: I0123 10:55:22.461374 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6e6bc4c7-f6a3-4e6f-82be-9a652d8e86d9-serving-cert\") pod \"6e6bc4c7-f6a3-4e6f-82be-9a652d8e86d9\" (UID: \"6e6bc4c7-f6a3-4e6f-82be-9a652d8e86d9\") " Jan 23 10:55:22 crc kubenswrapper[4957]: I0123 10:55:22.461490 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e6bc4c7-f6a3-4e6f-82be-9a652d8e86d9-config\") pod \"6e6bc4c7-f6a3-4e6f-82be-9a652d8e86d9\" (UID: \"6e6bc4c7-f6a3-4e6f-82be-9a652d8e86d9\") " Jan 23 10:55:22 crc kubenswrapper[4957]: I0123 10:55:22.461564 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6e6bc4c7-f6a3-4e6f-82be-9a652d8e86d9-client-ca\") pod \"6e6bc4c7-f6a3-4e6f-82be-9a652d8e86d9\" (UID: \"6e6bc4c7-f6a3-4e6f-82be-9a652d8e86d9\") " Jan 23 10:55:22 crc kubenswrapper[4957]: I0123 10:55:22.461849 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24be1c19-2466-4e1e-9232-9697534b5d6e-catalog-content\") pod \"24be1c19-2466-4e1e-9232-9697534b5d6e\" (UID: \"24be1c19-2466-4e1e-9232-9697534b5d6e\") " Jan 23 10:55:22 crc kubenswrapper[4957]: I0123 10:55:22.461935 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/784bb395-54fe-47ab-9fd3-0298329d8566-catalog-content\") pod \"784bb395-54fe-47ab-9fd3-0298329d8566\" (UID: \"784bb395-54fe-47ab-9fd3-0298329d8566\") " Jan 23 10:55:22 crc kubenswrapper[4957]: I0123 10:55:22.462061 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24be1c19-2466-4e1e-9232-9697534b5d6e-utilities\") pod \"24be1c19-2466-4e1e-9232-9697534b5d6e\" (UID: \"24be1c19-2466-4e1e-9232-9697534b5d6e\") " Jan 23 10:55:22 crc kubenswrapper[4957]: I0123 10:55:22.462111 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4hfkc\" (UniqueName: \"kubernetes.io/projected/784bb395-54fe-47ab-9fd3-0298329d8566-kube-api-access-4hfkc\") pod \"784bb395-54fe-47ab-9fd3-0298329d8566\" (UID: \"784bb395-54fe-47ab-9fd3-0298329d8566\") " Jan 23 10:55:22 crc kubenswrapper[4957]: I0123 10:55:22.462184 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nhdc5\" (UniqueName: \"kubernetes.io/projected/6e6bc4c7-f6a3-4e6f-82be-9a652d8e86d9-kube-api-access-nhdc5\") pod \"6e6bc4c7-f6a3-4e6f-82be-9a652d8e86d9\" (UID: \"6e6bc4c7-f6a3-4e6f-82be-9a652d8e86d9\") " Jan 23 10:55:22 crc kubenswrapper[4957]: I0123 10:55:22.462505 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hht6r\" (UniqueName: \"kubernetes.io/projected/336f11b9-7c5e-47c9-86cd-8fffc5f6caf1-kube-api-access-hht6r\") pod \"route-controller-manager-c54fd9bc-zh7lp\" (UID: \"336f11b9-7c5e-47c9-86cd-8fffc5f6caf1\") " pod="openshift-route-controller-manager/route-controller-manager-c54fd9bc-zh7lp" Jan 23 10:55:22 crc kubenswrapper[4957]: I0123 10:55:22.462663 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/336f11b9-7c5e-47c9-86cd-8fffc5f6caf1-config\") pod \"route-controller-manager-c54fd9bc-zh7lp\" (UID: \"336f11b9-7c5e-47c9-86cd-8fffc5f6caf1\") " pod="openshift-route-controller-manager/route-controller-manager-c54fd9bc-zh7lp" Jan 23 10:55:22 crc kubenswrapper[4957]: I0123 10:55:22.462728 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/336f11b9-7c5e-47c9-86cd-8fffc5f6caf1-client-ca\") pod \"route-controller-manager-c54fd9bc-zh7lp\" (UID: \"336f11b9-7c5e-47c9-86cd-8fffc5f6caf1\") " pod="openshift-route-controller-manager/route-controller-manager-c54fd9bc-zh7lp" Jan 23 10:55:22 crc kubenswrapper[4957]: I0123 10:55:22.462797 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/336f11b9-7c5e-47c9-86cd-8fffc5f6caf1-serving-cert\") pod \"route-controller-manager-c54fd9bc-zh7lp\" (UID: \"336f11b9-7c5e-47c9-86cd-8fffc5f6caf1\") " pod="openshift-route-controller-manager/route-controller-manager-c54fd9bc-zh7lp" Jan 23 10:55:22 crc kubenswrapper[4957]: I0123 10:55:22.465491 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24be1c19-2466-4e1e-9232-9697534b5d6e-utilities" (OuterVolumeSpecName: "utilities") pod "24be1c19-2466-4e1e-9232-9697534b5d6e" (UID: "24be1c19-2466-4e1e-9232-9697534b5d6e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 23 10:55:22 crc kubenswrapper[4957]: I0123 10:55:22.467487 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e6bc4c7-f6a3-4e6f-82be-9a652d8e86d9-kube-api-access-nhdc5" (OuterVolumeSpecName: "kube-api-access-nhdc5") pod "6e6bc4c7-f6a3-4e6f-82be-9a652d8e86d9" (UID: "6e6bc4c7-f6a3-4e6f-82be-9a652d8e86d9"). InnerVolumeSpecName "kube-api-access-nhdc5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 23 10:55:22 crc kubenswrapper[4957]: I0123 10:55:22.469487 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/784bb395-54fe-47ab-9fd3-0298329d8566-kube-api-access-4hfkc" (OuterVolumeSpecName: "kube-api-access-4hfkc") pod "784bb395-54fe-47ab-9fd3-0298329d8566" (UID: "784bb395-54fe-47ab-9fd3-0298329d8566"). InnerVolumeSpecName "kube-api-access-4hfkc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 23 10:55:22 crc kubenswrapper[4957]: I0123 10:55:22.469914 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e6bc4c7-f6a3-4e6f-82be-9a652d8e86d9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6e6bc4c7-f6a3-4e6f-82be-9a652d8e86d9" (UID: "6e6bc4c7-f6a3-4e6f-82be-9a652d8e86d9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 23 10:55:22 crc kubenswrapper[4957]: I0123 10:55:22.479916 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e6bc4c7-f6a3-4e6f-82be-9a652d8e86d9-client-ca" (OuterVolumeSpecName: "client-ca") pod "6e6bc4c7-f6a3-4e6f-82be-9a652d8e86d9" (UID: "6e6bc4c7-f6a3-4e6f-82be-9a652d8e86d9"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 10:55:22 crc kubenswrapper[4957]: I0123 10:55:22.480047 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e6bc4c7-f6a3-4e6f-82be-9a652d8e86d9-config" (OuterVolumeSpecName: "config") pod "6e6bc4c7-f6a3-4e6f-82be-9a652d8e86d9" (UID: "6e6bc4c7-f6a3-4e6f-82be-9a652d8e86d9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 10:55:22 crc kubenswrapper[4957]: I0123 10:55:22.481625 4957 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-54ddc8bc77-xr4fw" Jan 23 10:55:22 crc kubenswrapper[4957]: I0123 10:55:22.499213 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/784bb395-54fe-47ab-9fd3-0298329d8566-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "784bb395-54fe-47ab-9fd3-0298329d8566" (UID: "784bb395-54fe-47ab-9fd3-0298329d8566"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 23 10:55:22 crc kubenswrapper[4957]: I0123 10:55:22.553413 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24be1c19-2466-4e1e-9232-9697534b5d6e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "24be1c19-2466-4e1e-9232-9697534b5d6e" (UID: "24be1c19-2466-4e1e-9232-9697534b5d6e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 23 10:55:22 crc kubenswrapper[4957]: I0123 10:55:22.563391 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ff23ec1-7d08-4d1b-bdd5-6e97a2662479-config\") pod \"5ff23ec1-7d08-4d1b-bdd5-6e97a2662479\" (UID: \"5ff23ec1-7d08-4d1b-bdd5-6e97a2662479\") " Jan 23 10:55:22 crc kubenswrapper[4957]: I0123 10:55:22.563440 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5ff23ec1-7d08-4d1b-bdd5-6e97a2662479-serving-cert\") pod \"5ff23ec1-7d08-4d1b-bdd5-6e97a2662479\" (UID: \"5ff23ec1-7d08-4d1b-bdd5-6e97a2662479\") " Jan 23 10:55:22 crc kubenswrapper[4957]: I0123 10:55:22.563506 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5ff23ec1-7d08-4d1b-bdd5-6e97a2662479-proxy-ca-bundles\") pod \"5ff23ec1-7d08-4d1b-bdd5-6e97a2662479\" (UID: \"5ff23ec1-7d08-4d1b-bdd5-6e97a2662479\") " Jan 23 10:55:22 crc kubenswrapper[4957]: I0123 10:55:22.563534 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f8cgz\" (UniqueName: \"kubernetes.io/projected/5ff23ec1-7d08-4d1b-bdd5-6e97a2662479-kube-api-access-f8cgz\") pod \"5ff23ec1-7d08-4d1b-bdd5-6e97a2662479\" (UID: \"5ff23ec1-7d08-4d1b-bdd5-6e97a2662479\") " Jan 23 10:55:22 crc kubenswrapper[4957]: I0123 10:55:22.563554 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/784bb395-54fe-47ab-9fd3-0298329d8566-utilities\") pod \"784bb395-54fe-47ab-9fd3-0298329d8566\" (UID: \"784bb395-54fe-47ab-9fd3-0298329d8566\") " Jan 23 10:55:22 crc kubenswrapper[4957]: I0123 10:55:22.563573 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nwt4b\" (UniqueName: 
\"kubernetes.io/projected/24be1c19-2466-4e1e-9232-9697534b5d6e-kube-api-access-nwt4b\") pod \"24be1c19-2466-4e1e-9232-9697534b5d6e\" (UID: \"24be1c19-2466-4e1e-9232-9697534b5d6e\") " Jan 23 10:55:22 crc kubenswrapper[4957]: I0123 10:55:22.563591 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5ff23ec1-7d08-4d1b-bdd5-6e97a2662479-client-ca\") pod \"5ff23ec1-7d08-4d1b-bdd5-6e97a2662479\" (UID: \"5ff23ec1-7d08-4d1b-bdd5-6e97a2662479\") " Jan 23 10:55:22 crc kubenswrapper[4957]: I0123 10:55:22.563685 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hht6r\" (UniqueName: \"kubernetes.io/projected/336f11b9-7c5e-47c9-86cd-8fffc5f6caf1-kube-api-access-hht6r\") pod \"route-controller-manager-c54fd9bc-zh7lp\" (UID: \"336f11b9-7c5e-47c9-86cd-8fffc5f6caf1\") " pod="openshift-route-controller-manager/route-controller-manager-c54fd9bc-zh7lp" Jan 23 10:55:22 crc kubenswrapper[4957]: I0123 10:55:22.563731 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/336f11b9-7c5e-47c9-86cd-8fffc5f6caf1-config\") pod \"route-controller-manager-c54fd9bc-zh7lp\" (UID: \"336f11b9-7c5e-47c9-86cd-8fffc5f6caf1\") " pod="openshift-route-controller-manager/route-controller-manager-c54fd9bc-zh7lp" Jan 23 10:55:22 crc kubenswrapper[4957]: I0123 10:55:22.563751 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/336f11b9-7c5e-47c9-86cd-8fffc5f6caf1-client-ca\") pod \"route-controller-manager-c54fd9bc-zh7lp\" (UID: \"336f11b9-7c5e-47c9-86cd-8fffc5f6caf1\") " pod="openshift-route-controller-manager/route-controller-manager-c54fd9bc-zh7lp" Jan 23 10:55:22 crc kubenswrapper[4957]: I0123 10:55:22.563775 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/336f11b9-7c5e-47c9-86cd-8fffc5f6caf1-serving-cert\") pod \"route-controller-manager-c54fd9bc-zh7lp\" (UID: \"336f11b9-7c5e-47c9-86cd-8fffc5f6caf1\") " pod="openshift-route-controller-manager/route-controller-manager-c54fd9bc-zh7lp" Jan 23 10:55:22 crc kubenswrapper[4957]: I0123 10:55:22.563852 4957 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24be1c19-2466-4e1e-9232-9697534b5d6e-utilities\") on node \"crc\" DevicePath \"\"" Jan 23 10:55:22 crc kubenswrapper[4957]: I0123 10:55:22.563865 4957 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4hfkc\" (UniqueName: \"kubernetes.io/projected/784bb395-54fe-47ab-9fd3-0298329d8566-kube-api-access-4hfkc\") on node \"crc\" DevicePath \"\"" Jan 23 10:55:22 crc kubenswrapper[4957]: I0123 10:55:22.563875 4957 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nhdc5\" (UniqueName: \"kubernetes.io/projected/6e6bc4c7-f6a3-4e6f-82be-9a652d8e86d9-kube-api-access-nhdc5\") on node \"crc\" DevicePath \"\"" Jan 23 10:55:22 crc kubenswrapper[4957]: I0123 10:55:22.563883 4957 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6e6bc4c7-f6a3-4e6f-82be-9a652d8e86d9-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 23 10:55:22 crc kubenswrapper[4957]: I0123 10:55:22.563892 4957 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e6bc4c7-f6a3-4e6f-82be-9a652d8e86d9-config\") 
on node \"crc\" DevicePath \"\"" Jan 23 10:55:22 crc kubenswrapper[4957]: I0123 10:55:22.563899 4957 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6e6bc4c7-f6a3-4e6f-82be-9a652d8e86d9-client-ca\") on node \"crc\" DevicePath \"\"" Jan 23 10:55:22 crc kubenswrapper[4957]: I0123 10:55:22.563907 4957 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24be1c19-2466-4e1e-9232-9697534b5d6e-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 23 10:55:22 crc kubenswrapper[4957]: I0123 10:55:22.563916 4957 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/784bb395-54fe-47ab-9fd3-0298329d8566-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 23 10:55:22 crc kubenswrapper[4957]: I0123 10:55:22.564515 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ff23ec1-7d08-4d1b-bdd5-6e97a2662479-config" (OuterVolumeSpecName: "config") pod "5ff23ec1-7d08-4d1b-bdd5-6e97a2662479" (UID: "5ff23ec1-7d08-4d1b-bdd5-6e97a2662479"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 10:55:22 crc kubenswrapper[4957]: I0123 10:55:22.564864 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ff23ec1-7d08-4d1b-bdd5-6e97a2662479-client-ca" (OuterVolumeSpecName: "client-ca") pod "5ff23ec1-7d08-4d1b-bdd5-6e97a2662479" (UID: "5ff23ec1-7d08-4d1b-bdd5-6e97a2662479"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 10:55:22 crc kubenswrapper[4957]: I0123 10:55:22.566612 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/784bb395-54fe-47ab-9fd3-0298329d8566-utilities" (OuterVolumeSpecName: "utilities") pod "784bb395-54fe-47ab-9fd3-0298329d8566" (UID: "784bb395-54fe-47ab-9fd3-0298329d8566"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 23 10:55:22 crc kubenswrapper[4957]: I0123 10:55:22.566850 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ff23ec1-7d08-4d1b-bdd5-6e97a2662479-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "5ff23ec1-7d08-4d1b-bdd5-6e97a2662479" (UID: "5ff23ec1-7d08-4d1b-bdd5-6e97a2662479"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 10:55:22 crc kubenswrapper[4957]: I0123 10:55:22.567277 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/336f11b9-7c5e-47c9-86cd-8fffc5f6caf1-serving-cert\") pod \"route-controller-manager-c54fd9bc-zh7lp\" (UID: \"336f11b9-7c5e-47c9-86cd-8fffc5f6caf1\") " pod="openshift-route-controller-manager/route-controller-manager-c54fd9bc-zh7lp" Jan 23 10:55:22 crc kubenswrapper[4957]: I0123 10:55:22.567481 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ff23ec1-7d08-4d1b-bdd5-6e97a2662479-kube-api-access-f8cgz" (OuterVolumeSpecName: "kube-api-access-f8cgz") pod "5ff23ec1-7d08-4d1b-bdd5-6e97a2662479" (UID: "5ff23ec1-7d08-4d1b-bdd5-6e97a2662479"). InnerVolumeSpecName "kube-api-access-f8cgz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 23 10:55:22 crc kubenswrapper[4957]: I0123 10:55:22.568066 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/336f11b9-7c5e-47c9-86cd-8fffc5f6caf1-client-ca\") pod \"route-controller-manager-c54fd9bc-zh7lp\" (UID: \"336f11b9-7c5e-47c9-86cd-8fffc5f6caf1\") " pod="openshift-route-controller-manager/route-controller-manager-c54fd9bc-zh7lp" Jan 23 10:55:22 crc kubenswrapper[4957]: I0123 10:55:22.569064 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/336f11b9-7c5e-47c9-86cd-8fffc5f6caf1-config\") pod \"route-controller-manager-c54fd9bc-zh7lp\" (UID: \"336f11b9-7c5e-47c9-86cd-8fffc5f6caf1\") " pod="openshift-route-controller-manager/route-controller-manager-c54fd9bc-zh7lp" Jan 23 10:55:22 crc kubenswrapper[4957]: I0123 10:55:22.569674 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24be1c19-2466-4e1e-9232-9697534b5d6e-kube-api-access-nwt4b" (OuterVolumeSpecName: "kube-api-access-nwt4b") pod "24be1c19-2466-4e1e-9232-9697534b5d6e" (UID: "24be1c19-2466-4e1e-9232-9697534b5d6e"). InnerVolumeSpecName "kube-api-access-nwt4b". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 23 10:55:22 crc kubenswrapper[4957]: I0123 10:55:22.569987 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ff23ec1-7d08-4d1b-bdd5-6e97a2662479-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5ff23ec1-7d08-4d1b-bdd5-6e97a2662479" (UID: "5ff23ec1-7d08-4d1b-bdd5-6e97a2662479"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 23 10:55:22 crc kubenswrapper[4957]: I0123 10:55:22.579657 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hht6r\" (UniqueName: \"kubernetes.io/projected/336f11b9-7c5e-47c9-86cd-8fffc5f6caf1-kube-api-access-hht6r\") pod \"route-controller-manager-c54fd9bc-zh7lp\" (UID: \"336f11b9-7c5e-47c9-86cd-8fffc5f6caf1\") " pod="openshift-route-controller-manager/route-controller-manager-c54fd9bc-zh7lp" Jan 23 10:55:22 crc kubenswrapper[4957]: I0123 10:55:22.664607 4957 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-c54fd9bc-zh7lp" Jan 23 10:55:22 crc kubenswrapper[4957]: I0123 10:55:22.665156 4957 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nwt4b\" (UniqueName: \"kubernetes.io/projected/24be1c19-2466-4e1e-9232-9697534b5d6e-kube-api-access-nwt4b\") on node \"crc\" DevicePath \"\"" Jan 23 10:55:22 crc kubenswrapper[4957]: I0123 10:55:22.665194 4957 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5ff23ec1-7d08-4d1b-bdd5-6e97a2662479-client-ca\") on node \"crc\" DevicePath \"\"" Jan 23 10:55:22 crc kubenswrapper[4957]: I0123 10:55:22.665209 4957 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ff23ec1-7d08-4d1b-bdd5-6e97a2662479-config\") on node \"crc\" DevicePath \"\"" Jan 23 10:55:22 crc kubenswrapper[4957]: I0123 10:55:22.665237 4957 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5ff23ec1-7d08-4d1b-bdd5-6e97a2662479-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 23 10:55:22 crc kubenswrapper[4957]: I0123 10:55:22.665245 4957 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5ff23ec1-7d08-4d1b-bdd5-6e97a2662479-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 23 10:55:22 crc kubenswrapper[4957]: I0123 10:55:22.665253 4957 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f8cgz\" (UniqueName: \"kubernetes.io/projected/5ff23ec1-7d08-4d1b-bdd5-6e97a2662479-kube-api-access-f8cgz\") on node \"crc\" DevicePath \"\"" Jan 23 10:55:22 crc kubenswrapper[4957]: I0123 10:55:22.665262 4957 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/784bb395-54fe-47ab-9fd3-0298329d8566-utilities\") on node \"crc\" DevicePath \"\"" Jan 23 10:55:22 crc kubenswrapper[4957]: I0123 10:55:22.733413 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-54ddc8bc77-xr4fw" event={"ID":"5ff23ec1-7d08-4d1b-bdd5-6e97a2662479","Type":"ContainerDied","Data":"2d79e277f36cc5d19b47913a8215a517dab1cb5e92d7c6a0242e3f656a5ce537"} Jan 23 10:55:22 crc kubenswrapper[4957]: I0123 10:55:22.733459 4957 scope.go:117] "RemoveContainer" containerID="006a101a7bfcb87a430da458e87a04fdf85da360b22196ba744994e050bb9fdb" Jan 23 10:55:22 crc kubenswrapper[4957]: I0123 10:55:22.733549 4957 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-54ddc8bc77-xr4fw" Jan 23 10:55:22 crc kubenswrapper[4957]: I0123 10:55:22.740770 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-87khr" event={"ID":"24be1c19-2466-4e1e-9232-9697534b5d6e","Type":"ContainerDied","Data":"bf3669c35d73fe0cc3020369027d3ad209c6dda235e4a23a24cb1cc0aa9447ef"} Jan 23 10:55:22 crc kubenswrapper[4957]: I0123 10:55:22.740886 4957 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-87khr" Jan 23 10:55:22 crc kubenswrapper[4957]: I0123 10:55:22.744973 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f5qpn" event={"ID":"c0641b36-685a-4625-93cb-a6159de3628e","Type":"ContainerStarted","Data":"9971b9e4d92a531d2573870a602084c69753dd5e325ff9f72096b9b1b8cf7c78"} Jan 23 10:55:22 crc kubenswrapper[4957]: I0123 10:55:22.751621 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l7xvl" event={"ID":"4210122d-91b9-4890-a8d0-23e71c42d121","Type":"ContainerStarted","Data":"c1ffc150b76e56e77cbde75f64b521bc42e86e8378da5423d3a018421b520c69"} Jan 23 10:55:22 crc kubenswrapper[4957]: I0123 10:55:22.768091 4957 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-f5qpn" podStartSLOduration=2.425501722 podStartE2EDuration="1m15.768075321s" podCreationTimestamp="2026-01-23 10:54:07 +0000 UTC" firstStartedPulling="2026-01-23 10:54:08.498886789 +0000 UTC m=+158.036139476" lastFinishedPulling="2026-01-23 10:55:21.841460388 +0000 UTC m=+231.378713075" observedRunningTime="2026-01-23 10:55:22.766598752 +0000 UTC m=+232.303851439" watchObservedRunningTime="2026-01-23 10:55:22.768075321 +0000 UTC m=+232.305328008" Jan 23 10:55:22 crc kubenswrapper[4957]: I0123 10:55:22.775654 4957 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5jhgc" Jan 23 10:55:22 crc kubenswrapper[4957]: I0123 10:55:22.792334 4957 scope.go:117] "RemoveContainer" containerID="d6752cb476d759d545ed0898d08ffad32d316c92e10f1bdd0feed559b300bbb3" Jan 23 10:55:22 crc kubenswrapper[4957]: I0123 10:55:22.804700 4957 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-54ddc8bc77-xr4fw"] Jan 23 10:55:22 crc kubenswrapper[4957]: I0123 10:55:22.806244 4957 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-54ddc8bc77-xr4fw"] Jan 23 10:55:22 crc kubenswrapper[4957]: I0123 10:55:22.806266 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5jhgc" event={"ID":"784bb395-54fe-47ab-9fd3-0298329d8566","Type":"ContainerDied","Data":"8374321db367b7447922d4e6fd9cd26e78708d2cd6b9059804b69b4731434fed"} Jan 23 10:55:22 crc kubenswrapper[4957]: I0123 10:55:22.811620 4957 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-l7xvl" podStartSLOduration=3.309142963 podStartE2EDuration="1m19.811605235s" podCreationTimestamp="2026-01-23 10:54:03 +0000 UTC" firstStartedPulling="2026-01-23 10:54:05.395472777 +0000 UTC m=+154.932725464" lastFinishedPulling="2026-01-23 10:55:21.897935049 +0000 UTC m=+231.435187736" observedRunningTime="2026-01-23 10:55:22.809861719 +0000 UTC m=+232.347114406" watchObservedRunningTime="2026-01-23 10:55:22.811605235 +0000 UTC m=+232.348857922" Jan 23 10:55:22 crc kubenswrapper[4957]: I0123 10:55:22.816974 4957 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7486dcfd4c-szhlc" Jan 23 10:55:22 crc kubenswrapper[4957]: I0123 10:55:22.840824 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7486dcfd4c-szhlc" event={"ID":"6e6bc4c7-f6a3-4e6f-82be-9a652d8e86d9","Type":"ContainerDied","Data":"f7770afb598658df5d1fe239be7228c6c0fbc062c5185c0eb6b9aeaaf93d1e30"} Jan 23 10:55:22 crc kubenswrapper[4957]: I0123 10:55:22.848693 4957 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5jhgc"] Jan 23 10:55:22 crc kubenswrapper[4957]: I0123 10:55:22.858835 4957 scope.go:117] "RemoveContainer" containerID="2b31fe528d36092a70294c93e7130464c754e6fb09964153ca51da5bb9c30679" Jan 23 10:55:22 crc kubenswrapper[4957]: I0123 10:55:22.864449 4957 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-5jhgc"] Jan 23 10:55:22 crc kubenswrapper[4957]: I0123 10:55:22.878496 4957 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-87khr"] Jan 23 10:55:22 crc kubenswrapper[4957]: I0123 10:55:22.888331 4957 scope.go:117] "RemoveContainer" containerID="7615e0bd7b82a55e8ca5b1a82d1b3a24dde36d0d806f689bec9bf3a9ea3cf7e9" Jan 23 10:55:22 crc kubenswrapper[4957]: I0123 10:55:22.894604 4957 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-87khr"] Jan 23 10:55:22 crc kubenswrapper[4957]: I0123 10:55:22.903457 4957 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7486dcfd4c-szhlc"] Jan 23 10:55:22 crc kubenswrapper[4957]: I0123 10:55:22.907797 4957 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7486dcfd4c-szhlc"] Jan 23 10:55:22 crc kubenswrapper[4957]: I0123 10:55:22.908677 4957 scope.go:117] "RemoveContainer" containerID="5c563c47d91953b035693d5af537c298d6b7f9e067659d1171e81c81c27b91bf" Jan 23 10:55:22 crc kubenswrapper[4957]: I0123 10:55:22.922622 4957 scope.go:117] "RemoveContainer" containerID="b4cf07f4860c65c89b13b7203a10a0a9fe192e40bbebb28b8ad7512ae527aafd" Jan 23 10:55:22 crc kubenswrapper[4957]: I0123 10:55:22.942049 4957 scope.go:117] "RemoveContainer" containerID="dbf96a68664a0fab9835c3fe7a05995dffe94ef88eabd3e62c2f3ba960e97f71" Jan 23 10:55:22 crc kubenswrapper[4957]: I0123 10:55:22.980598 4957 scope.go:117] "RemoveContainer" containerID="c15537c2de03d6416a3916e1058fea58b2703f55ead63b91ae983eade45da094" Jan 23 10:55:22 crc kubenswrapper[4957]: E0123 10:55:22.983386 4957 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e6bc4c7_f6a3_4e6f_82be_9a652d8e86d9.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e6bc4c7_f6a3_4e6f_82be_9a652d8e86d9.slice/crio-f7770afb598658df5d1fe239be7228c6c0fbc062c5185c0eb6b9aeaaf93d1e30\": RecentStats: unable to find data in memory cache]" Jan 23 10:55:23 crc kubenswrapper[4957]: I0123 10:55:23.119322 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-c54fd9bc-zh7lp"] Jan 23 10:55:23 crc kubenswrapper[4957]: I0123 10:55:23.563678 4957 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-authentication/oauth-openshift-558db77b4-wtmvm"] Jan 23 10:55:23 crc kubenswrapper[4957]: I0123 10:55:23.833095 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-c54fd9bc-zh7lp" event={"ID":"336f11b9-7c5e-47c9-86cd-8fffc5f6caf1","Type":"ContainerStarted","Data":"78f9c881642cb9ebe0ccdd114954d397ed6bb18b0060ac736c2c0224fbb92f60"} Jan 23 10:55:23 crc kubenswrapper[4957]: I0123 10:55:23.833135 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-c54fd9bc-zh7lp" event={"ID":"336f11b9-7c5e-47c9-86cd-8fffc5f6caf1","Type":"ContainerStarted","Data":"6d49c59e16cddd823b84aaa6fdad76811e85076757cb2d1212c22c54a396d132"} Jan 23 10:55:23 crc kubenswrapper[4957]: I0123 10:55:23.834212 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-c54fd9bc-zh7lp" Jan 23 10:55:23 crc kubenswrapper[4957]: I0123 10:55:23.841241 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-c54fd9bc-zh7lp" Jan 23 10:55:23 crc kubenswrapper[4957]: I0123 10:55:23.873724 4957 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-c54fd9bc-zh7lp" podStartSLOduration=4.873710582 podStartE2EDuration="4.873710582s" podCreationTimestamp="2026-01-23 10:55:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-23 10:55:23.858160157 +0000 UTC m=+233.395412844" watchObservedRunningTime="2026-01-23 10:55:23.873710582 +0000 UTC m=+233.410963269" Jan 23 10:55:23 crc kubenswrapper[4957]: I0123 10:55:23.929788 4957 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5zrcr"] Jan 23 10:55:23 crc kubenswrapper[4957]: I0123 10:55:23.930034 4957 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-5zrcr" podUID="fd5862ba-aabd-4c6f-b546-3f22d40592b8" containerName="registry-server" containerID="cri-o://e73dbca9c5d8267119099f1d2c0c6a216a518492d27f869717dfb27f094dfe6b" gracePeriod=30 Jan 23 10:55:23 crc kubenswrapper[4957]: I0123 10:55:23.934039 4957 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-l7xvl"] Jan 23 10:55:23 crc kubenswrapper[4957]: E0123 10:55:23.934369 4957 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e73dbca9c5d8267119099f1d2c0c6a216a518492d27f869717dfb27f094dfe6b" cmd=["grpc_health_probe","-addr=:50051"] Jan 23 10:55:23 crc kubenswrapper[4957]: I0123 10:55:23.934476 4957 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-l7xvl" podUID="4210122d-91b9-4890-a8d0-23e71c42d121" containerName="registry-server" containerID="cri-o://c1ffc150b76e56e77cbde75f64b521bc42e86e8378da5423d3a018421b520c69" gracePeriod=30 Jan 23 10:55:23 crc kubenswrapper[4957]: E0123 10:55:23.936249 4957 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="e73dbca9c5d8267119099f1d2c0c6a216a518492d27f869717dfb27f094dfe6b" cmd=["grpc_health_probe","-addr=:50051"] Jan 23 10:55:23 crc kubenswrapper[4957]: I0123 10:55:23.940903 4957 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-7cfk5"] Jan 23 10:55:23 crc kubenswrapper[4957]: I0123 10:55:23.941089 4957 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-7cfk5" podUID="c951bf2d-f3fe-4a75-8e95-040c46cb1f01" containerName="marketplace-operator" containerID="cri-o://5f972fac2d5aa710602d4698dc2dacf3fedd45dd65a2153c1bdcd250ba5c72ed" gracePeriod=30 Jan 23 10:55:23 crc kubenswrapper[4957]: E0123 10:55:23.946254 4957 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e73dbca9c5d8267119099f1d2c0c6a216a518492d27f869717dfb27f094dfe6b" cmd=["grpc_health_probe","-addr=:50051"] Jan 23 10:55:23 crc kubenswrapper[4957]: E0123 10:55:23.946351 4957 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-marketplace/certified-operators-5zrcr" podUID="fd5862ba-aabd-4c6f-b546-3f22d40592b8" containerName="registry-server" Jan 23 10:55:23 crc kubenswrapper[4957]: I0123 10:55:23.948875 4957 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2c452"] Jan 23 10:55:23 crc kubenswrapper[4957]: I0123 10:55:23.949138 4957 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-2c452" podUID="783873e0-bce9-4f05-849d-0fe265010d39" containerName="registry-server" containerID="cri-o://af3967c7d4dec06fcfcebb91c59852179bacb2c68717d53b842c8fbe1af2ff7f" gracePeriod=30 Jan 23 10:55:23 crc kubenswrapper[4957]: I0123 10:55:23.955298 4957 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-f5qpn"] Jan 23 10:55:23 crc kubenswrapper[4957]: I0123 10:55:23.955555 4957 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-f5qpn" podUID="c0641b36-685a-4625-93cb-a6159de3628e" containerName="registry-server" containerID="cri-o://9971b9e4d92a531d2573870a602084c69753dd5e325ff9f72096b9b1b8cf7c78" gracePeriod=30 Jan 23 10:55:23 crc kubenswrapper[4957]: I0123 10:55:23.964206 4957 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-pq4t4"] Jan 23 10:55:23 crc kubenswrapper[4957]: E0123 10:55:23.964431 4957 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24be1c19-2466-4e1e-9232-9697534b5d6e" containerName="extract-utilities" Jan 23 10:55:23 crc kubenswrapper[4957]: I0123 10:55:23.964447 4957 state_mem.go:107] "Deleted CPUSet assignment" podUID="24be1c19-2466-4e1e-9232-9697534b5d6e" containerName="extract-utilities" Jan 23 10:55:23 crc kubenswrapper[4957]: E0123 10:55:23.964458 4957 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="784bb395-54fe-47ab-9fd3-0298329d8566" containerName="extract-utilities" Jan 23 10:55:23 crc kubenswrapper[4957]: I0123 10:55:23.964465 4957 state_mem.go:107] "Deleted CPUSet assignment" podUID="784bb395-54fe-47ab-9fd3-0298329d8566" containerName="extract-utilities" Jan 23 10:55:23 crc 
kubenswrapper[4957]: E0123 10:55:23.964471 4957 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="784bb395-54fe-47ab-9fd3-0298329d8566" containerName="extract-content" Jan 23 10:55:23 crc kubenswrapper[4957]: I0123 10:55:23.964478 4957 state_mem.go:107] "Deleted CPUSet assignment" podUID="784bb395-54fe-47ab-9fd3-0298329d8566" containerName="extract-content" Jan 23 10:55:23 crc kubenswrapper[4957]: E0123 10:55:23.964493 4957 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="784bb395-54fe-47ab-9fd3-0298329d8566" containerName="registry-server" Jan 23 10:55:23 crc kubenswrapper[4957]: I0123 10:55:23.964499 4957 state_mem.go:107] "Deleted CPUSet assignment" podUID="784bb395-54fe-47ab-9fd3-0298329d8566" containerName="registry-server" Jan 23 10:55:23 crc kubenswrapper[4957]: E0123 10:55:23.964510 4957 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24be1c19-2466-4e1e-9232-9697534b5d6e" containerName="extract-content" Jan 23 10:55:23 crc kubenswrapper[4957]: I0123 10:55:23.964516 4957 state_mem.go:107] "Deleted CPUSet assignment" podUID="24be1c19-2466-4e1e-9232-9697534b5d6e" containerName="extract-content" Jan 23 10:55:23 crc kubenswrapper[4957]: E0123 10:55:23.964525 4957 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ff23ec1-7d08-4d1b-bdd5-6e97a2662479" containerName="controller-manager" Jan 23 10:55:23 crc kubenswrapper[4957]: I0123 10:55:23.964531 4957 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ff23ec1-7d08-4d1b-bdd5-6e97a2662479" containerName="controller-manager" Jan 23 10:55:23 crc kubenswrapper[4957]: E0123 10:55:23.964539 4957 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24be1c19-2466-4e1e-9232-9697534b5d6e" containerName="registry-server" Jan 23 10:55:23 crc kubenswrapper[4957]: I0123 10:55:23.964545 4957 state_mem.go:107] "Deleted CPUSet assignment" podUID="24be1c19-2466-4e1e-9232-9697534b5d6e" containerName="registry-server" Jan 23 10:55:23 crc kubenswrapper[4957]: I0123 10:55:23.964650 4957 memory_manager.go:354] "RemoveStaleState removing state" podUID="784bb395-54fe-47ab-9fd3-0298329d8566" containerName="registry-server" Jan 23 10:55:23 crc kubenswrapper[4957]: I0123 10:55:23.964662 4957 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ff23ec1-7d08-4d1b-bdd5-6e97a2662479" containerName="controller-manager" Jan 23 10:55:23 crc kubenswrapper[4957]: I0123 10:55:23.964671 4957 memory_manager.go:354] "RemoveStaleState removing state" podUID="24be1c19-2466-4e1e-9232-9697534b5d6e" containerName="registry-server" Jan 23 10:55:23 crc kubenswrapper[4957]: I0123 10:55:23.965030 4957 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-pq4t4" Jan 23 10:55:23 crc kubenswrapper[4957]: I0123 10:55:23.970394 4957 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-l54d2"] Jan 23 10:55:23 crc kubenswrapper[4957]: I0123 10:55:23.970669 4957 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-l54d2" podUID="e16b20f4-5d6a-4cf6-878a-3cf03afe72bb" containerName="registry-server" containerID="cri-o://469d05b466a9a6a0f444264b7b6dfb1fd254375482721d913849b87999f43a7b" gracePeriod=30 Jan 23 10:55:23 crc kubenswrapper[4957]: I0123 10:55:23.972636 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-pq4t4"] Jan 23 10:55:23 crc kubenswrapper[4957]: I0123 10:55:23.985411 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/564e2b7f-db59-4c5e-bb9d-eaefadf2e1a8-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-pq4t4\" (UID: \"564e2b7f-db59-4c5e-bb9d-eaefadf2e1a8\") " pod="openshift-marketplace/marketplace-operator-79b997595-pq4t4" Jan 23 10:55:23 crc kubenswrapper[4957]: I0123 10:55:23.985468 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdjhm\" (UniqueName: \"kubernetes.io/projected/564e2b7f-db59-4c5e-bb9d-eaefadf2e1a8-kube-api-access-kdjhm\") pod \"marketplace-operator-79b997595-pq4t4\" (UID: \"564e2b7f-db59-4c5e-bb9d-eaefadf2e1a8\") " pod="openshift-marketplace/marketplace-operator-79b997595-pq4t4" Jan 23 10:55:23 crc kubenswrapper[4957]: I0123 10:55:23.985493 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/564e2b7f-db59-4c5e-bb9d-eaefadf2e1a8-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-pq4t4\" (UID: \"564e2b7f-db59-4c5e-bb9d-eaefadf2e1a8\") " pod="openshift-marketplace/marketplace-operator-79b997595-pq4t4" Jan 23 10:55:24 crc kubenswrapper[4957]: E0123 10:55:24.083647 4957 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e73dbca9c5d8267119099f1d2c0c6a216a518492d27f869717dfb27f094dfe6b is running failed: container process not found" containerID="e73dbca9c5d8267119099f1d2c0c6a216a518492d27f869717dfb27f094dfe6b" cmd=["grpc_health_probe","-addr=:50051"] Jan 23 10:55:24 crc kubenswrapper[4957]: E0123 10:55:24.084080 4957 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e73dbca9c5d8267119099f1d2c0c6a216a518492d27f869717dfb27f094dfe6b is running failed: container process not found" containerID="e73dbca9c5d8267119099f1d2c0c6a216a518492d27f869717dfb27f094dfe6b" cmd=["grpc_health_probe","-addr=:50051"] Jan 23 10:55:24 crc kubenswrapper[4957]: E0123 10:55:24.084405 4957 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e73dbca9c5d8267119099f1d2c0c6a216a518492d27f869717dfb27f094dfe6b is running failed: container process not found" containerID="e73dbca9c5d8267119099f1d2c0c6a216a518492d27f869717dfb27f094dfe6b" cmd=["grpc_health_probe","-addr=:50051"] Jan 23 10:55:24 crc 
kubenswrapper[4957]: E0123 10:55:24.084441 4957 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e73dbca9c5d8267119099f1d2c0c6a216a518492d27f869717dfb27f094dfe6b is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/certified-operators-5zrcr" podUID="fd5862ba-aabd-4c6f-b546-3f22d40592b8" containerName="registry-server" Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.086079 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/564e2b7f-db59-4c5e-bb9d-eaefadf2e1a8-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-pq4t4\" (UID: \"564e2b7f-db59-4c5e-bb9d-eaefadf2e1a8\") " pod="openshift-marketplace/marketplace-operator-79b997595-pq4t4" Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.086150 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/564e2b7f-db59-4c5e-bb9d-eaefadf2e1a8-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-pq4t4\" (UID: \"564e2b7f-db59-4c5e-bb9d-eaefadf2e1a8\") " pod="openshift-marketplace/marketplace-operator-79b997595-pq4t4" Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.086189 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdjhm\" (UniqueName: \"kubernetes.io/projected/564e2b7f-db59-4c5e-bb9d-eaefadf2e1a8-kube-api-access-kdjhm\") pod \"marketplace-operator-79b997595-pq4t4\" (UID: \"564e2b7f-db59-4c5e-bb9d-eaefadf2e1a8\") " pod="openshift-marketplace/marketplace-operator-79b997595-pq4t4" Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.087681 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/564e2b7f-db59-4c5e-bb9d-eaefadf2e1a8-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-pq4t4\" (UID: \"564e2b7f-db59-4c5e-bb9d-eaefadf2e1a8\") " pod="openshift-marketplace/marketplace-operator-79b997595-pq4t4" Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.092246 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/564e2b7f-db59-4c5e-bb9d-eaefadf2e1a8-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-pq4t4\" (UID: \"564e2b7f-db59-4c5e-bb9d-eaefadf2e1a8\") " pod="openshift-marketplace/marketplace-operator-79b997595-pq4t4" Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.109308 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdjhm\" (UniqueName: \"kubernetes.io/projected/564e2b7f-db59-4c5e-bb9d-eaefadf2e1a8-kube-api-access-kdjhm\") pod \"marketplace-operator-79b997595-pq4t4\" (UID: \"564e2b7f-db59-4c5e-bb9d-eaefadf2e1a8\") " pod="openshift-marketplace/marketplace-operator-79b997595-pq4t4" Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.281590 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-pq4t4" Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.356830 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-l7xvl" Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.398126 4957 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-7cfk5" Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.426016 4957 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5zrcr" Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.526665 4957 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-l54d2" Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.587463 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-pq4t4"] Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.592504 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qgcwn\" (UniqueName: \"kubernetes.io/projected/c951bf2d-f3fe-4a75-8e95-040c46cb1f01-kube-api-access-qgcwn\") pod \"c951bf2d-f3fe-4a75-8e95-040c46cb1f01\" (UID: \"c951bf2d-f3fe-4a75-8e95-040c46cb1f01\") " Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.592583 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c951bf2d-f3fe-4a75-8e95-040c46cb1f01-marketplace-trusted-ca\") pod \"c951bf2d-f3fe-4a75-8e95-040c46cb1f01\" (UID: \"c951bf2d-f3fe-4a75-8e95-040c46cb1f01\") " Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.592623 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c951bf2d-f3fe-4a75-8e95-040c46cb1f01-marketplace-operator-metrics\") pod \"c951bf2d-f3fe-4a75-8e95-040c46cb1f01\" (UID: \"c951bf2d-f3fe-4a75-8e95-040c46cb1f01\") " Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.592675 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g5k4m\" (UniqueName: \"kubernetes.io/projected/fd5862ba-aabd-4c6f-b546-3f22d40592b8-kube-api-access-g5k4m\") pod \"fd5862ba-aabd-4c6f-b546-3f22d40592b8\" (UID: \"fd5862ba-aabd-4c6f-b546-3f22d40592b8\") " Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.592702 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd5862ba-aabd-4c6f-b546-3f22d40592b8-catalog-content\") pod \"fd5862ba-aabd-4c6f-b546-3f22d40592b8\" (UID: \"fd5862ba-aabd-4c6f-b546-3f22d40592b8\") " Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.592727 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd5862ba-aabd-4c6f-b546-3f22d40592b8-utilities\") pod \"fd5862ba-aabd-4c6f-b546-3f22d40592b8\" (UID: \"fd5862ba-aabd-4c6f-b546-3f22d40592b8\") " Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.593489 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c951bf2d-f3fe-4a75-8e95-040c46cb1f01-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "c951bf2d-f3fe-4a75-8e95-040c46cb1f01" (UID: "c951bf2d-f3fe-4a75-8e95-040c46cb1f01"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.593568 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd5862ba-aabd-4c6f-b546-3f22d40592b8-utilities" (OuterVolumeSpecName: "utilities") pod "fd5862ba-aabd-4c6f-b546-3f22d40592b8" (UID: "fd5862ba-aabd-4c6f-b546-3f22d40592b8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.594205 4957 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-f6857c485-zm57k"] Jan 23 10:55:24 crc kubenswrapper[4957]: E0123 10:55:24.594417 4957 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e16b20f4-5d6a-4cf6-878a-3cf03afe72bb" containerName="extract-content" Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.594434 4957 state_mem.go:107] "Deleted CPUSet assignment" podUID="e16b20f4-5d6a-4cf6-878a-3cf03afe72bb" containerName="extract-content" Jan 23 10:55:24 crc kubenswrapper[4957]: E0123 10:55:24.594445 4957 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd5862ba-aabd-4c6f-b546-3f22d40592b8" containerName="extract-utilities" Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.594452 4957 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd5862ba-aabd-4c6f-b546-3f22d40592b8" containerName="extract-utilities" Jan 23 10:55:24 crc kubenswrapper[4957]: E0123 10:55:24.594463 4957 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd5862ba-aabd-4c6f-b546-3f22d40592b8" containerName="extract-content" Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.594470 4957 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd5862ba-aabd-4c6f-b546-3f22d40592b8" containerName="extract-content" Jan 23 10:55:24 crc kubenswrapper[4957]: E0123 10:55:24.594482 4957 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e16b20f4-5d6a-4cf6-878a-3cf03afe72bb" containerName="registry-server" Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.594488 4957 state_mem.go:107] "Deleted CPUSet assignment" podUID="e16b20f4-5d6a-4cf6-878a-3cf03afe72bb" containerName="registry-server" Jan 23 10:55:24 crc kubenswrapper[4957]: E0123 10:55:24.594499 4957 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e16b20f4-5d6a-4cf6-878a-3cf03afe72bb" containerName="extract-utilities" Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.594505 4957 state_mem.go:107] "Deleted CPUSet assignment" podUID="e16b20f4-5d6a-4cf6-878a-3cf03afe72bb" containerName="extract-utilities" Jan 23 10:55:24 crc kubenswrapper[4957]: E0123 10:55:24.594518 4957 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd5862ba-aabd-4c6f-b546-3f22d40592b8" containerName="registry-server" Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.594524 4957 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd5862ba-aabd-4c6f-b546-3f22d40592b8" containerName="registry-server" Jan 23 10:55:24 crc kubenswrapper[4957]: E0123 10:55:24.594534 4957 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c951bf2d-f3fe-4a75-8e95-040c46cb1f01" containerName="marketplace-operator" Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.594540 4957 state_mem.go:107] "Deleted CPUSet assignment" podUID="c951bf2d-f3fe-4a75-8e95-040c46cb1f01" containerName="marketplace-operator" Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.594650 4957 
memory_manager.go:354] "RemoveStaleState removing state" podUID="e16b20f4-5d6a-4cf6-878a-3cf03afe72bb" containerName="registry-server" Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.594666 4957 memory_manager.go:354] "RemoveStaleState removing state" podUID="c951bf2d-f3fe-4a75-8e95-040c46cb1f01" containerName="marketplace-operator" Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.594674 4957 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd5862ba-aabd-4c6f-b546-3f22d40592b8" containerName="registry-server" Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.595046 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-f6857c485-zm57k" Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.600701 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd5862ba-aabd-4c6f-b546-3f22d40592b8-kube-api-access-g5k4m" (OuterVolumeSpecName: "kube-api-access-g5k4m") pod "fd5862ba-aabd-4c6f-b546-3f22d40592b8" (UID: "fd5862ba-aabd-4c6f-b546-3f22d40592b8"). InnerVolumeSpecName "kube-api-access-g5k4m". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.602045 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.602240 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.602245 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.602375 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.602459 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.604394 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c951bf2d-f3fe-4a75-8e95-040c46cb1f01-kube-api-access-qgcwn" (OuterVolumeSpecName: "kube-api-access-qgcwn") pod "c951bf2d-f3fe-4a75-8e95-040c46cb1f01" (UID: "c951bf2d-f3fe-4a75-8e95-040c46cb1f01"). InnerVolumeSpecName "kube-api-access-qgcwn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.604844 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c951bf2d-f3fe-4a75-8e95-040c46cb1f01-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "c951bf2d-f3fe-4a75-8e95-040c46cb1f01" (UID: "c951bf2d-f3fe-4a75-8e95-040c46cb1f01"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.608020 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-f6857c485-zm57k"] Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.608126 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.628855 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.666788 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd5862ba-aabd-4c6f-b546-3f22d40592b8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fd5862ba-aabd-4c6f-b546-3f22d40592b8" (UID: "fd5862ba-aabd-4c6f-b546-3f22d40592b8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.694293 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e16b20f4-5d6a-4cf6-878a-3cf03afe72bb-catalog-content\") pod \"e16b20f4-5d6a-4cf6-878a-3cf03afe72bb\" (UID: \"e16b20f4-5d6a-4cf6-878a-3cf03afe72bb\") " Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.694336 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e16b20f4-5d6a-4cf6-878a-3cf03afe72bb-utilities\") pod \"e16b20f4-5d6a-4cf6-878a-3cf03afe72bb\" (UID: \"e16b20f4-5d6a-4cf6-878a-3cf03afe72bb\") " Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.694369 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-km4jv\" (UniqueName: \"kubernetes.io/projected/e16b20f4-5d6a-4cf6-878a-3cf03afe72bb-kube-api-access-km4jv\") pod \"e16b20f4-5d6a-4cf6-878a-3cf03afe72bb\" (UID: \"e16b20f4-5d6a-4cf6-878a-3cf03afe72bb\") " Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.694534 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1b54be7c-8091-4cb4-bca2-8bb0691cd1ab-serving-cert\") pod \"controller-manager-f6857c485-zm57k\" (UID: \"1b54be7c-8091-4cb4-bca2-8bb0691cd1ab\") " pod="openshift-controller-manager/controller-manager-f6857c485-zm57k" Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.694600 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b54be7c-8091-4cb4-bca2-8bb0691cd1ab-config\") pod \"controller-manager-f6857c485-zm57k\" (UID: \"1b54be7c-8091-4cb4-bca2-8bb0691cd1ab\") " pod="openshift-controller-manager/controller-manager-f6857c485-zm57k" Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.694619 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1b54be7c-8091-4cb4-bca2-8bb0691cd1ab-client-ca\") pod \"controller-manager-f6857c485-zm57k\" (UID: \"1b54be7c-8091-4cb4-bca2-8bb0691cd1ab\") " pod="openshift-controller-manager/controller-manager-f6857c485-zm57k" Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.694637 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-6wqdw\" (UniqueName: \"kubernetes.io/projected/1b54be7c-8091-4cb4-bca2-8bb0691cd1ab-kube-api-access-6wqdw\") pod \"controller-manager-f6857c485-zm57k\" (UID: \"1b54be7c-8091-4cb4-bca2-8bb0691cd1ab\") " pod="openshift-controller-manager/controller-manager-f6857c485-zm57k" Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.694655 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1b54be7c-8091-4cb4-bca2-8bb0691cd1ab-proxy-ca-bundles\") pod \"controller-manager-f6857c485-zm57k\" (UID: \"1b54be7c-8091-4cb4-bca2-8bb0691cd1ab\") " pod="openshift-controller-manager/controller-manager-f6857c485-zm57k" Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.694693 4957 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g5k4m\" (UniqueName: \"kubernetes.io/projected/fd5862ba-aabd-4c6f-b546-3f22d40592b8-kube-api-access-g5k4m\") on node \"crc\" DevicePath \"\"" Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.694704 4957 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd5862ba-aabd-4c6f-b546-3f22d40592b8-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.694714 4957 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd5862ba-aabd-4c6f-b546-3f22d40592b8-utilities\") on node \"crc\" DevicePath \"\"" Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.694725 4957 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qgcwn\" (UniqueName: \"kubernetes.io/projected/c951bf2d-f3fe-4a75-8e95-040c46cb1f01-kube-api-access-qgcwn\") on node \"crc\" DevicePath \"\"" Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.694734 4957 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c951bf2d-f3fe-4a75-8e95-040c46cb1f01-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.694743 4957 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c951bf2d-f3fe-4a75-8e95-040c46cb1f01-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.695796 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e16b20f4-5d6a-4cf6-878a-3cf03afe72bb-utilities" (OuterVolumeSpecName: "utilities") pod "e16b20f4-5d6a-4cf6-878a-3cf03afe72bb" (UID: "e16b20f4-5d6a-4cf6-878a-3cf03afe72bb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.698808 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e16b20f4-5d6a-4cf6-878a-3cf03afe72bb-kube-api-access-km4jv" (OuterVolumeSpecName: "kube-api-access-km4jv") pod "e16b20f4-5d6a-4cf6-878a-3cf03afe72bb" (UID: "e16b20f4-5d6a-4cf6-878a-3cf03afe72bb"). InnerVolumeSpecName "kube-api-access-km4jv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.786196 4957 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24be1c19-2466-4e1e-9232-9697534b5d6e" path="/var/lib/kubelet/pods/24be1c19-2466-4e1e-9232-9697534b5d6e/volumes" Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.787860 4957 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ff23ec1-7d08-4d1b-bdd5-6e97a2662479" path="/var/lib/kubelet/pods/5ff23ec1-7d08-4d1b-bdd5-6e97a2662479/volumes" Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.789866 4957 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e6bc4c7-f6a3-4e6f-82be-9a652d8e86d9" path="/var/lib/kubelet/pods/6e6bc4c7-f6a3-4e6f-82be-9a652d8e86d9/volumes" Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.792683 4957 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="784bb395-54fe-47ab-9fd3-0298329d8566" path="/var/lib/kubelet/pods/784bb395-54fe-47ab-9fd3-0298329d8566/volumes" Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.795944 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b54be7c-8091-4cb4-bca2-8bb0691cd1ab-config\") pod \"controller-manager-f6857c485-zm57k\" (UID: \"1b54be7c-8091-4cb4-bca2-8bb0691cd1ab\") " pod="openshift-controller-manager/controller-manager-f6857c485-zm57k" Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.795986 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1b54be7c-8091-4cb4-bca2-8bb0691cd1ab-client-ca\") pod \"controller-manager-f6857c485-zm57k\" (UID: \"1b54be7c-8091-4cb4-bca2-8bb0691cd1ab\") " pod="openshift-controller-manager/controller-manager-f6857c485-zm57k" Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.796042 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wqdw\" (UniqueName: \"kubernetes.io/projected/1b54be7c-8091-4cb4-bca2-8bb0691cd1ab-kube-api-access-6wqdw\") pod \"controller-manager-f6857c485-zm57k\" (UID: \"1b54be7c-8091-4cb4-bca2-8bb0691cd1ab\") " pod="openshift-controller-manager/controller-manager-f6857c485-zm57k" Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.796129 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1b54be7c-8091-4cb4-bca2-8bb0691cd1ab-proxy-ca-bundles\") pod \"controller-manager-f6857c485-zm57k\" (UID: \"1b54be7c-8091-4cb4-bca2-8bb0691cd1ab\") " pod="openshift-controller-manager/controller-manager-f6857c485-zm57k" Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.796336 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1b54be7c-8091-4cb4-bca2-8bb0691cd1ab-serving-cert\") pod \"controller-manager-f6857c485-zm57k\" (UID: \"1b54be7c-8091-4cb4-bca2-8bb0691cd1ab\") " pod="openshift-controller-manager/controller-manager-f6857c485-zm57k" Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.796401 4957 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e16b20f4-5d6a-4cf6-878a-3cf03afe72bb-utilities\") on node \"crc\" DevicePath \"\"" Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.796414 4957 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-km4jv\" 
(UniqueName: \"kubernetes.io/projected/e16b20f4-5d6a-4cf6-878a-3cf03afe72bb-kube-api-access-km4jv\") on node \"crc\" DevicePath \"\"" Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.800093 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1b54be7c-8091-4cb4-bca2-8bb0691cd1ab-client-ca\") pod \"controller-manager-f6857c485-zm57k\" (UID: \"1b54be7c-8091-4cb4-bca2-8bb0691cd1ab\") " pod="openshift-controller-manager/controller-manager-f6857c485-zm57k" Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.800249 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1b54be7c-8091-4cb4-bca2-8bb0691cd1ab-proxy-ca-bundles\") pod \"controller-manager-f6857c485-zm57k\" (UID: \"1b54be7c-8091-4cb4-bca2-8bb0691cd1ab\") " pod="openshift-controller-manager/controller-manager-f6857c485-zm57k" Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.801592 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1b54be7c-8091-4cb4-bca2-8bb0691cd1ab-serving-cert\") pod \"controller-manager-f6857c485-zm57k\" (UID: \"1b54be7c-8091-4cb4-bca2-8bb0691cd1ab\") " pod="openshift-controller-manager/controller-manager-f6857c485-zm57k" Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.811482 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b54be7c-8091-4cb4-bca2-8bb0691cd1ab-config\") pod \"controller-manager-f6857c485-zm57k\" (UID: \"1b54be7c-8091-4cb4-bca2-8bb0691cd1ab\") " pod="openshift-controller-manager/controller-manager-f6857c485-zm57k" Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.813207 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wqdw\" (UniqueName: \"kubernetes.io/projected/1b54be7c-8091-4cb4-bca2-8bb0691cd1ab-kube-api-access-6wqdw\") pod \"controller-manager-f6857c485-zm57k\" (UID: \"1b54be7c-8091-4cb4-bca2-8bb0691cd1ab\") " pod="openshift-controller-manager/controller-manager-f6857c485-zm57k" Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.847264 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-f5qpn_c0641b36-685a-4625-93cb-a6159de3628e/registry-server/0.log" Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.847827 4957 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-f5qpn" Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.848449 4957 generic.go:334] "Generic (PLEG): container finished" podID="783873e0-bce9-4f05-849d-0fe265010d39" containerID="af3967c7d4dec06fcfcebb91c59852179bacb2c68717d53b842c8fbe1af2ff7f" exitCode=0 Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.848501 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2c452" event={"ID":"783873e0-bce9-4f05-849d-0fe265010d39","Type":"ContainerDied","Data":"af3967c7d4dec06fcfcebb91c59852179bacb2c68717d53b842c8fbe1af2ff7f"} Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.849315 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-pq4t4" event={"ID":"564e2b7f-db59-4c5e-bb9d-eaefadf2e1a8","Type":"ContainerStarted","Data":"fecd745e66f5a456a7f1dde76d0b7e192351eef9b47101b46d048741f2648aed"} Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.856020 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-f5qpn_c0641b36-685a-4625-93cb-a6159de3628e/registry-server/0.log" Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.857335 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e16b20f4-5d6a-4cf6-878a-3cf03afe72bb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e16b20f4-5d6a-4cf6-878a-3cf03afe72bb" (UID: "e16b20f4-5d6a-4cf6-878a-3cf03afe72bb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.857674 4957 generic.go:334] "Generic (PLEG): container finished" podID="c0641b36-685a-4625-93cb-a6159de3628e" containerID="9971b9e4d92a531d2573870a602084c69753dd5e325ff9f72096b9b1b8cf7c78" exitCode=1 Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.858049 4957 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-f5qpn" Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.858511 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f5qpn" event={"ID":"c0641b36-685a-4625-93cb-a6159de3628e","Type":"ContainerDied","Data":"9971b9e4d92a531d2573870a602084c69753dd5e325ff9f72096b9b1b8cf7c78"} Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.858550 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f5qpn" event={"ID":"c0641b36-685a-4625-93cb-a6159de3628e","Type":"ContainerDied","Data":"70c23d98a5a2dc19892529565de66f4100c86ed8948466d5606dff009e1f418a"} Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.858567 4957 scope.go:117] "RemoveContainer" containerID="9971b9e4d92a531d2573870a602084c69753dd5e325ff9f72096b9b1b8cf7c78" Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.861182 4957 generic.go:334] "Generic (PLEG): container finished" podID="c951bf2d-f3fe-4a75-8e95-040c46cb1f01" containerID="5f972fac2d5aa710602d4698dc2dacf3fedd45dd65a2153c1bdcd250ba5c72ed" exitCode=0 Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.861252 4957 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-7cfk5" Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.861229 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-7cfk5" event={"ID":"c951bf2d-f3fe-4a75-8e95-040c46cb1f01","Type":"ContainerDied","Data":"5f972fac2d5aa710602d4698dc2dacf3fedd45dd65a2153c1bdcd250ba5c72ed"} Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.864485 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-7cfk5" event={"ID":"c951bf2d-f3fe-4a75-8e95-040c46cb1f01","Type":"ContainerDied","Data":"5d8dba33398399fc101e8bbfab78cbb211bd0bcbd60f97616bce85d72e438a66"} Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.865086 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l7xvl" event={"ID":"4210122d-91b9-4890-a8d0-23e71c42d121","Type":"ContainerDied","Data":"c1ffc150b76e56e77cbde75f64b521bc42e86e8378da5423d3a018421b520c69"} Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.865097 4957 generic.go:334] "Generic (PLEG): container finished" podID="4210122d-91b9-4890-a8d0-23e71c42d121" containerID="c1ffc150b76e56e77cbde75f64b521bc42e86e8378da5423d3a018421b520c69" exitCode=0 Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.881059 4957 scope.go:117] "RemoveContainer" containerID="fa9763f0015aaf67c85a8ac482befff393d16f1c4b9e459968fb6bd2d042ce27" Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.881444 4957 generic.go:334] "Generic (PLEG): container finished" podID="e16b20f4-5d6a-4cf6-878a-3cf03afe72bb" containerID="469d05b466a9a6a0f444264b7b6dfb1fd254375482721d913849b87999f43a7b" exitCode=0 Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.881505 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l54d2" event={"ID":"e16b20f4-5d6a-4cf6-878a-3cf03afe72bb","Type":"ContainerDied","Data":"469d05b466a9a6a0f444264b7b6dfb1fd254375482721d913849b87999f43a7b"} Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.881529 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l54d2" event={"ID":"e16b20f4-5d6a-4cf6-878a-3cf03afe72bb","Type":"ContainerDied","Data":"8d56279619a1a1833f9d2e9ae863fc325299b3e581d8017e3e0b67b97c6b89e5"} Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.881558 4957 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-l54d2" Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.887502 4957 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-7cfk5"] Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.890000 4957 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-7cfk5"] Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.890051 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5zrcr" event={"ID":"fd5862ba-aabd-4c6f-b546-3f22d40592b8","Type":"ContainerDied","Data":"e73dbca9c5d8267119099f1d2c0c6a216a518492d27f869717dfb27f094dfe6b"} Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.890022 4957 generic.go:334] "Generic (PLEG): container finished" podID="fd5862ba-aabd-4c6f-b546-3f22d40592b8" containerID="e73dbca9c5d8267119099f1d2c0c6a216a518492d27f869717dfb27f094dfe6b" exitCode=0 Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.890096 4957 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5zrcr" Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.890114 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5zrcr" event={"ID":"fd5862ba-aabd-4c6f-b546-3f22d40592b8","Type":"ContainerDied","Data":"b8fdf4baae8de0f9c652d6977b171226b4344032580c44c1b4d15d2101ad5a59"} Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.897117 4957 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e16b20f4-5d6a-4cf6-878a-3cf03afe72bb-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.899011 4957 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2c452" Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.902898 4957 scope.go:117] "RemoveContainer" containerID="e314ec50c604d1ef5d26912b9a516d332c00f59307f5992f89272c4f522eb897" Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.912454 4957 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-l54d2"] Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.918224 4957 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-l54d2"] Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.922412 4957 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5zrcr"] Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.926960 4957 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-5zrcr"] Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.947215 4957 scope.go:117] "RemoveContainer" containerID="9971b9e4d92a531d2573870a602084c69753dd5e325ff9f72096b9b1b8cf7c78" Jan 23 10:55:24 crc kubenswrapper[4957]: E0123 10:55:24.947764 4957 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9971b9e4d92a531d2573870a602084c69753dd5e325ff9f72096b9b1b8cf7c78\": container with ID starting with 9971b9e4d92a531d2573870a602084c69753dd5e325ff9f72096b9b1b8cf7c78 not found: ID does not exist" containerID="9971b9e4d92a531d2573870a602084c69753dd5e325ff9f72096b9b1b8cf7c78" Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.947798 4957 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9971b9e4d92a531d2573870a602084c69753dd5e325ff9f72096b9b1b8cf7c78"} err="failed to get container status \"9971b9e4d92a531d2573870a602084c69753dd5e325ff9f72096b9b1b8cf7c78\": rpc error: code = NotFound desc = could not find container \"9971b9e4d92a531d2573870a602084c69753dd5e325ff9f72096b9b1b8cf7c78\": container with ID starting with 9971b9e4d92a531d2573870a602084c69753dd5e325ff9f72096b9b1b8cf7c78 not found: ID does not exist" Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.947824 4957 scope.go:117] "RemoveContainer" containerID="fa9763f0015aaf67c85a8ac482befff393d16f1c4b9e459968fb6bd2d042ce27" Jan 23 10:55:24 crc kubenswrapper[4957]: E0123 10:55:24.948220 4957 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa9763f0015aaf67c85a8ac482befff393d16f1c4b9e459968fb6bd2d042ce27\": container with ID starting with fa9763f0015aaf67c85a8ac482befff393d16f1c4b9e459968fb6bd2d042ce27 not found: ID does not exist" containerID="fa9763f0015aaf67c85a8ac482befff393d16f1c4b9e459968fb6bd2d042ce27" Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.948249 4957 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa9763f0015aaf67c85a8ac482befff393d16f1c4b9e459968fb6bd2d042ce27"} err="failed to get container status \"fa9763f0015aaf67c85a8ac482befff393d16f1c4b9e459968fb6bd2d042ce27\": rpc error: code = NotFound desc = could not find container \"fa9763f0015aaf67c85a8ac482befff393d16f1c4b9e459968fb6bd2d042ce27\": container with ID starting with fa9763f0015aaf67c85a8ac482befff393d16f1c4b9e459968fb6bd2d042ce27 not found: ID does not exist" Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.948269 4957 scope.go:117] "RemoveContainer" 
containerID="e314ec50c604d1ef5d26912b9a516d332c00f59307f5992f89272c4f522eb897" Jan 23 10:55:24 crc kubenswrapper[4957]: E0123 10:55:24.948524 4957 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e314ec50c604d1ef5d26912b9a516d332c00f59307f5992f89272c4f522eb897\": container with ID starting with e314ec50c604d1ef5d26912b9a516d332c00f59307f5992f89272c4f522eb897 not found: ID does not exist" containerID="e314ec50c604d1ef5d26912b9a516d332c00f59307f5992f89272c4f522eb897" Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.948547 4957 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e314ec50c604d1ef5d26912b9a516d332c00f59307f5992f89272c4f522eb897"} err="failed to get container status \"e314ec50c604d1ef5d26912b9a516d332c00f59307f5992f89272c4f522eb897\": rpc error: code = NotFound desc = could not find container \"e314ec50c604d1ef5d26912b9a516d332c00f59307f5992f89272c4f522eb897\": container with ID starting with e314ec50c604d1ef5d26912b9a516d332c00f59307f5992f89272c4f522eb897 not found: ID does not exist" Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.948561 4957 scope.go:117] "RemoveContainer" containerID="5f972fac2d5aa710602d4698dc2dacf3fedd45dd65a2153c1bdcd250ba5c72ed" Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.959876 4957 scope.go:117] "RemoveContainer" containerID="5f972fac2d5aa710602d4698dc2dacf3fedd45dd65a2153c1bdcd250ba5c72ed" Jan 23 10:55:24 crc kubenswrapper[4957]: E0123 10:55:24.960238 4957 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f972fac2d5aa710602d4698dc2dacf3fedd45dd65a2153c1bdcd250ba5c72ed\": container with ID starting with 5f972fac2d5aa710602d4698dc2dacf3fedd45dd65a2153c1bdcd250ba5c72ed not found: ID does not exist" containerID="5f972fac2d5aa710602d4698dc2dacf3fedd45dd65a2153c1bdcd250ba5c72ed" Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.960261 4957 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f972fac2d5aa710602d4698dc2dacf3fedd45dd65a2153c1bdcd250ba5c72ed"} err="failed to get container status \"5f972fac2d5aa710602d4698dc2dacf3fedd45dd65a2153c1bdcd250ba5c72ed\": rpc error: code = NotFound desc = could not find container \"5f972fac2d5aa710602d4698dc2dacf3fedd45dd65a2153c1bdcd250ba5c72ed\": container with ID starting with 5f972fac2d5aa710602d4698dc2dacf3fedd45dd65a2153c1bdcd250ba5c72ed not found: ID does not exist" Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.960292 4957 scope.go:117] "RemoveContainer" containerID="469d05b466a9a6a0f444264b7b6dfb1fd254375482721d913849b87999f43a7b" Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.971628 4957 scope.go:117] "RemoveContainer" containerID="3184b37dc088a47df77688cebb1834390b0aa81bdcc42c13706e7ccddfcbdaf4" Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.987869 4957 scope.go:117] "RemoveContainer" containerID="db48b690b67c2ff8c0f31dbc1ad8da68672e03ca59cb0710c0be39be46ae5916" Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.998344 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0641b36-685a-4625-93cb-a6159de3628e-catalog-content\") pod \"c0641b36-685a-4625-93cb-a6159de3628e\" (UID: \"c0641b36-685a-4625-93cb-a6159de3628e\") " Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.998607 4957 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-m4xnr\" (UniqueName: \"kubernetes.io/projected/783873e0-bce9-4f05-849d-0fe265010d39-kube-api-access-m4xnr\") pod \"783873e0-bce9-4f05-849d-0fe265010d39\" (UID: \"783873e0-bce9-4f05-849d-0fe265010d39\") " Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.998672 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/783873e0-bce9-4f05-849d-0fe265010d39-utilities\") pod \"783873e0-bce9-4f05-849d-0fe265010d39\" (UID: \"783873e0-bce9-4f05-849d-0fe265010d39\") " Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.998697 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vx99p\" (UniqueName: \"kubernetes.io/projected/c0641b36-685a-4625-93cb-a6159de3628e-kube-api-access-vx99p\") pod \"c0641b36-685a-4625-93cb-a6159de3628e\" (UID: \"c0641b36-685a-4625-93cb-a6159de3628e\") " Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.998717 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/783873e0-bce9-4f05-849d-0fe265010d39-catalog-content\") pod \"783873e0-bce9-4f05-849d-0fe265010d39\" (UID: \"783873e0-bce9-4f05-849d-0fe265010d39\") " Jan 23 10:55:24 crc kubenswrapper[4957]: I0123 10:55:24.998740 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0641b36-685a-4625-93cb-a6159de3628e-utilities\") pod \"c0641b36-685a-4625-93cb-a6159de3628e\" (UID: \"c0641b36-685a-4625-93cb-a6159de3628e\") " Jan 23 10:55:25 crc kubenswrapper[4957]: I0123 10:55:25.000098 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/783873e0-bce9-4f05-849d-0fe265010d39-utilities" (OuterVolumeSpecName: "utilities") pod "783873e0-bce9-4f05-849d-0fe265010d39" (UID: "783873e0-bce9-4f05-849d-0fe265010d39"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 23 10:55:25 crc kubenswrapper[4957]: I0123 10:55:25.000791 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0641b36-685a-4625-93cb-a6159de3628e-utilities" (OuterVolumeSpecName: "utilities") pod "c0641b36-685a-4625-93cb-a6159de3628e" (UID: "c0641b36-685a-4625-93cb-a6159de3628e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 23 10:55:25 crc kubenswrapper[4957]: I0123 10:55:25.003314 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0641b36-685a-4625-93cb-a6159de3628e-kube-api-access-vx99p" (OuterVolumeSpecName: "kube-api-access-vx99p") pod "c0641b36-685a-4625-93cb-a6159de3628e" (UID: "c0641b36-685a-4625-93cb-a6159de3628e"). InnerVolumeSpecName "kube-api-access-vx99p". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 23 10:55:25 crc kubenswrapper[4957]: I0123 10:55:25.003908 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/783873e0-bce9-4f05-849d-0fe265010d39-kube-api-access-m4xnr" (OuterVolumeSpecName: "kube-api-access-m4xnr") pod "783873e0-bce9-4f05-849d-0fe265010d39" (UID: "783873e0-bce9-4f05-849d-0fe265010d39"). InnerVolumeSpecName "kube-api-access-m4xnr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 23 10:55:25 crc kubenswrapper[4957]: I0123 10:55:25.004246 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-f6857c485-zm57k" Jan 23 10:55:25 crc kubenswrapper[4957]: I0123 10:55:25.006087 4957 scope.go:117] "RemoveContainer" containerID="469d05b466a9a6a0f444264b7b6dfb1fd254375482721d913849b87999f43a7b" Jan 23 10:55:25 crc kubenswrapper[4957]: E0123 10:55:25.006482 4957 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"469d05b466a9a6a0f444264b7b6dfb1fd254375482721d913849b87999f43a7b\": container with ID starting with 469d05b466a9a6a0f444264b7b6dfb1fd254375482721d913849b87999f43a7b not found: ID does not exist" containerID="469d05b466a9a6a0f444264b7b6dfb1fd254375482721d913849b87999f43a7b" Jan 23 10:55:25 crc kubenswrapper[4957]: I0123 10:55:25.006520 4957 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"469d05b466a9a6a0f444264b7b6dfb1fd254375482721d913849b87999f43a7b"} err="failed to get container status \"469d05b466a9a6a0f444264b7b6dfb1fd254375482721d913849b87999f43a7b\": rpc error: code = NotFound desc = could not find container \"469d05b466a9a6a0f444264b7b6dfb1fd254375482721d913849b87999f43a7b\": container with ID starting with 469d05b466a9a6a0f444264b7b6dfb1fd254375482721d913849b87999f43a7b not found: ID does not exist" Jan 23 10:55:25 crc kubenswrapper[4957]: I0123 10:55:25.006572 4957 scope.go:117] "RemoveContainer" containerID="3184b37dc088a47df77688cebb1834390b0aa81bdcc42c13706e7ccddfcbdaf4" Jan 23 10:55:25 crc kubenswrapper[4957]: E0123 10:55:25.007060 4957 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3184b37dc088a47df77688cebb1834390b0aa81bdcc42c13706e7ccddfcbdaf4\": container with ID starting with 3184b37dc088a47df77688cebb1834390b0aa81bdcc42c13706e7ccddfcbdaf4 not found: ID does not exist" containerID="3184b37dc088a47df77688cebb1834390b0aa81bdcc42c13706e7ccddfcbdaf4" Jan 23 10:55:25 crc kubenswrapper[4957]: I0123 10:55:25.007084 4957 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3184b37dc088a47df77688cebb1834390b0aa81bdcc42c13706e7ccddfcbdaf4"} err="failed to get container status \"3184b37dc088a47df77688cebb1834390b0aa81bdcc42c13706e7ccddfcbdaf4\": rpc error: code = NotFound desc = could not find container \"3184b37dc088a47df77688cebb1834390b0aa81bdcc42c13706e7ccddfcbdaf4\": container with ID starting with 3184b37dc088a47df77688cebb1834390b0aa81bdcc42c13706e7ccddfcbdaf4 not found: ID does not exist" Jan 23 10:55:25 crc kubenswrapper[4957]: I0123 10:55:25.007121 4957 scope.go:117] "RemoveContainer" containerID="db48b690b67c2ff8c0f31dbc1ad8da68672e03ca59cb0710c0be39be46ae5916" Jan 23 10:55:25 crc kubenswrapper[4957]: E0123 10:55:25.007547 4957 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db48b690b67c2ff8c0f31dbc1ad8da68672e03ca59cb0710c0be39be46ae5916\": container with ID starting with db48b690b67c2ff8c0f31dbc1ad8da68672e03ca59cb0710c0be39be46ae5916 not found: ID does not exist" containerID="db48b690b67c2ff8c0f31dbc1ad8da68672e03ca59cb0710c0be39be46ae5916" Jan 23 10:55:25 crc kubenswrapper[4957]: I0123 10:55:25.007584 4957 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"db48b690b67c2ff8c0f31dbc1ad8da68672e03ca59cb0710c0be39be46ae5916"} err="failed to get container status \"db48b690b67c2ff8c0f31dbc1ad8da68672e03ca59cb0710c0be39be46ae5916\": rpc error: code = NotFound desc = could not find container \"db48b690b67c2ff8c0f31dbc1ad8da68672e03ca59cb0710c0be39be46ae5916\": container with ID starting with db48b690b67c2ff8c0f31dbc1ad8da68672e03ca59cb0710c0be39be46ae5916 not found: ID does not exist" Jan 23 10:55:25 crc kubenswrapper[4957]: I0123 10:55:25.007633 4957 scope.go:117] "RemoveContainer" containerID="e73dbca9c5d8267119099f1d2c0c6a216a518492d27f869717dfb27f094dfe6b" Jan 23 10:55:25 crc kubenswrapper[4957]: I0123 10:55:25.030117 4957 scope.go:117] "RemoveContainer" containerID="31f7c645a56bc8605242d18d669194cd860c8507230e2b8c7992753c5479b4ab" Jan 23 10:55:25 crc kubenswrapper[4957]: I0123 10:55:25.030124 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/783873e0-bce9-4f05-849d-0fe265010d39-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "783873e0-bce9-4f05-849d-0fe265010d39" (UID: "783873e0-bce9-4f05-849d-0fe265010d39"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 23 10:55:25 crc kubenswrapper[4957]: I0123 10:55:25.059186 4957 scope.go:117] "RemoveContainer" containerID="5f2292dcf69571c22928f32e7e6f12fb20570c177c2684951a468bc0d34c8d88" Jan 23 10:55:25 crc kubenswrapper[4957]: I0123 10:55:25.088795 4957 scope.go:117] "RemoveContainer" containerID="e73dbca9c5d8267119099f1d2c0c6a216a518492d27f869717dfb27f094dfe6b" Jan 23 10:55:25 crc kubenswrapper[4957]: E0123 10:55:25.091291 4957 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e73dbca9c5d8267119099f1d2c0c6a216a518492d27f869717dfb27f094dfe6b\": container with ID starting with e73dbca9c5d8267119099f1d2c0c6a216a518492d27f869717dfb27f094dfe6b not found: ID does not exist" containerID="e73dbca9c5d8267119099f1d2c0c6a216a518492d27f869717dfb27f094dfe6b" Jan 23 10:55:25 crc kubenswrapper[4957]: I0123 10:55:25.091347 4957 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e73dbca9c5d8267119099f1d2c0c6a216a518492d27f869717dfb27f094dfe6b"} err="failed to get container status \"e73dbca9c5d8267119099f1d2c0c6a216a518492d27f869717dfb27f094dfe6b\": rpc error: code = NotFound desc = could not find container \"e73dbca9c5d8267119099f1d2c0c6a216a518492d27f869717dfb27f094dfe6b\": container with ID starting with e73dbca9c5d8267119099f1d2c0c6a216a518492d27f869717dfb27f094dfe6b not found: ID does not exist" Jan 23 10:55:25 crc kubenswrapper[4957]: I0123 10:55:25.091383 4957 scope.go:117] "RemoveContainer" containerID="31f7c645a56bc8605242d18d669194cd860c8507230e2b8c7992753c5479b4ab" Jan 23 10:55:25 crc kubenswrapper[4957]: E0123 10:55:25.091674 4957 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31f7c645a56bc8605242d18d669194cd860c8507230e2b8c7992753c5479b4ab\": container with ID starting with 31f7c645a56bc8605242d18d669194cd860c8507230e2b8c7992753c5479b4ab not found: ID does not exist" containerID="31f7c645a56bc8605242d18d669194cd860c8507230e2b8c7992753c5479b4ab" Jan 23 10:55:25 crc kubenswrapper[4957]: I0123 10:55:25.091705 4957 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"31f7c645a56bc8605242d18d669194cd860c8507230e2b8c7992753c5479b4ab"} err="failed to get container status \"31f7c645a56bc8605242d18d669194cd860c8507230e2b8c7992753c5479b4ab\": rpc error: code = NotFound desc = could not find container \"31f7c645a56bc8605242d18d669194cd860c8507230e2b8c7992753c5479b4ab\": container with ID starting with 31f7c645a56bc8605242d18d669194cd860c8507230e2b8c7992753c5479b4ab not found: ID does not exist" Jan 23 10:55:25 crc kubenswrapper[4957]: I0123 10:55:25.091731 4957 scope.go:117] "RemoveContainer" containerID="5f2292dcf69571c22928f32e7e6f12fb20570c177c2684951a468bc0d34c8d88" Jan 23 10:55:25 crc kubenswrapper[4957]: E0123 10:55:25.091910 4957 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f2292dcf69571c22928f32e7e6f12fb20570c177c2684951a468bc0d34c8d88\": container with ID starting with 5f2292dcf69571c22928f32e7e6f12fb20570c177c2684951a468bc0d34c8d88 not found: ID does not exist" containerID="5f2292dcf69571c22928f32e7e6f12fb20570c177c2684951a468bc0d34c8d88" Jan 23 10:55:25 crc kubenswrapper[4957]: I0123 10:55:25.091932 4957 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f2292dcf69571c22928f32e7e6f12fb20570c177c2684951a468bc0d34c8d88"} err="failed to get container status \"5f2292dcf69571c22928f32e7e6f12fb20570c177c2684951a468bc0d34c8d88\": rpc error: code = NotFound desc = could not find container \"5f2292dcf69571c22928f32e7e6f12fb20570c177c2684951a468bc0d34c8d88\": container with ID starting with 5f2292dcf69571c22928f32e7e6f12fb20570c177c2684951a468bc0d34c8d88 not found: ID does not exist" Jan 23 10:55:25 crc kubenswrapper[4957]: I0123 10:55:25.100870 4957 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/783873e0-bce9-4f05-849d-0fe265010d39-utilities\") on node \"crc\" DevicePath \"\"" Jan 23 10:55:25 crc kubenswrapper[4957]: I0123 10:55:25.100934 4957 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vx99p\" (UniqueName: \"kubernetes.io/projected/c0641b36-685a-4625-93cb-a6159de3628e-kube-api-access-vx99p\") on node \"crc\" DevicePath \"\"" Jan 23 10:55:25 crc kubenswrapper[4957]: I0123 10:55:25.100950 4957 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/783873e0-bce9-4f05-849d-0fe265010d39-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 23 10:55:25 crc kubenswrapper[4957]: I0123 10:55:25.100989 4957 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0641b36-685a-4625-93cb-a6159de3628e-utilities\") on node \"crc\" DevicePath \"\"" Jan 23 10:55:25 crc kubenswrapper[4957]: I0123 10:55:25.101001 4957 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m4xnr\" (UniqueName: \"kubernetes.io/projected/783873e0-bce9-4f05-849d-0fe265010d39-kube-api-access-m4xnr\") on node \"crc\" DevicePath \"\"" Jan 23 10:55:25 crc kubenswrapper[4957]: I0123 10:55:25.204867 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0641b36-685a-4625-93cb-a6159de3628e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c0641b36-685a-4625-93cb-a6159de3628e" (UID: "c0641b36-685a-4625-93cb-a6159de3628e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 23 10:55:25 crc kubenswrapper[4957]: I0123 10:55:25.232189 4957 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l7xvl" Jan 23 10:55:25 crc kubenswrapper[4957]: I0123 10:55:25.306335 4957 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0641b36-685a-4625-93cb-a6159de3628e-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 23 10:55:25 crc kubenswrapper[4957]: I0123 10:55:25.407416 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xnnbf\" (UniqueName: \"kubernetes.io/projected/4210122d-91b9-4890-a8d0-23e71c42d121-kube-api-access-xnnbf\") pod \"4210122d-91b9-4890-a8d0-23e71c42d121\" (UID: \"4210122d-91b9-4890-a8d0-23e71c42d121\") " Jan 23 10:55:25 crc kubenswrapper[4957]: I0123 10:55:25.407569 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4210122d-91b9-4890-a8d0-23e71c42d121-catalog-content\") pod \"4210122d-91b9-4890-a8d0-23e71c42d121\" (UID: \"4210122d-91b9-4890-a8d0-23e71c42d121\") " Jan 23 10:55:25 crc kubenswrapper[4957]: I0123 10:55:25.410402 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4210122d-91b9-4890-a8d0-23e71c42d121-utilities\") pod \"4210122d-91b9-4890-a8d0-23e71c42d121\" (UID: \"4210122d-91b9-4890-a8d0-23e71c42d121\") " Jan 23 10:55:25 crc kubenswrapper[4957]: I0123 10:55:25.411162 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4210122d-91b9-4890-a8d0-23e71c42d121-utilities" (OuterVolumeSpecName: "utilities") pod "4210122d-91b9-4890-a8d0-23e71c42d121" (UID: "4210122d-91b9-4890-a8d0-23e71c42d121"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 23 10:55:25 crc kubenswrapper[4957]: I0123 10:55:25.411433 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4210122d-91b9-4890-a8d0-23e71c42d121-kube-api-access-xnnbf" (OuterVolumeSpecName: "kube-api-access-xnnbf") pod "4210122d-91b9-4890-a8d0-23e71c42d121" (UID: "4210122d-91b9-4890-a8d0-23e71c42d121"). InnerVolumeSpecName "kube-api-access-xnnbf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 23 10:55:25 crc kubenswrapper[4957]: I0123 10:55:25.468740 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-f6857c485-zm57k"] Jan 23 10:55:25 crc kubenswrapper[4957]: W0123 10:55:25.474678 4957 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b54be7c_8091_4cb4_bca2_8bb0691cd1ab.slice/crio-096f3ecb1a1f9e6b3a5fc96d43dbd0a2ec9925c8d68c1e055fb59cc16720974c WatchSource:0}: Error finding container 096f3ecb1a1f9e6b3a5fc96d43dbd0a2ec9925c8d68c1e055fb59cc16720974c: Status 404 returned error can't find the container with id 096f3ecb1a1f9e6b3a5fc96d43dbd0a2ec9925c8d68c1e055fb59cc16720974c Jan 23 10:55:25 crc kubenswrapper[4957]: I0123 10:55:25.476460 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4210122d-91b9-4890-a8d0-23e71c42d121-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4210122d-91b9-4890-a8d0-23e71c42d121" (UID: "4210122d-91b9-4890-a8d0-23e71c42d121"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 23 10:55:25 crc kubenswrapper[4957]: I0123 10:55:25.484146 4957 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-f5qpn"] Jan 23 10:55:25 crc kubenswrapper[4957]: I0123 10:55:25.503079 4957 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-f5qpn"] Jan 23 10:55:25 crc kubenswrapper[4957]: I0123 10:55:25.512033 4957 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xnnbf\" (UniqueName: \"kubernetes.io/projected/4210122d-91b9-4890-a8d0-23e71c42d121-kube-api-access-xnnbf\") on node \"crc\" DevicePath \"\"" Jan 23 10:55:25 crc kubenswrapper[4957]: I0123 10:55:25.512064 4957 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4210122d-91b9-4890-a8d0-23e71c42d121-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 23 10:55:25 crc kubenswrapper[4957]: I0123 10:55:25.512076 4957 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4210122d-91b9-4890-a8d0-23e71c42d121-utilities\") on node \"crc\" DevicePath \"\"" Jan 23 10:55:25 crc kubenswrapper[4957]: I0123 10:55:25.898186 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2c452" event={"ID":"783873e0-bce9-4f05-849d-0fe265010d39","Type":"ContainerDied","Data":"073ac0316bfb06567f325fbccf6bda2fef26ff25f51bf7bf248d8f27d1a69139"} Jan 23 10:55:25 crc kubenswrapper[4957]: I0123 10:55:25.898233 4957 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2c452" Jan 23 10:55:25 crc kubenswrapper[4957]: I0123 10:55:25.898237 4957 scope.go:117] "RemoveContainer" containerID="af3967c7d4dec06fcfcebb91c59852179bacb2c68717d53b842c8fbe1af2ff7f" Jan 23 10:55:25 crc kubenswrapper[4957]: I0123 10:55:25.899674 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-f6857c485-zm57k" event={"ID":"1b54be7c-8091-4cb4-bca2-8bb0691cd1ab","Type":"ContainerStarted","Data":"a932ded0eec0316dbcf394a6add196e5056c48b5d2a44b51bb89191816ed38b0"} Jan 23 10:55:25 crc kubenswrapper[4957]: I0123 10:55:25.899709 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-f6857c485-zm57k" event={"ID":"1b54be7c-8091-4cb4-bca2-8bb0691cd1ab","Type":"ContainerStarted","Data":"096f3ecb1a1f9e6b3a5fc96d43dbd0a2ec9925c8d68c1e055fb59cc16720974c"} Jan 23 10:55:25 crc kubenswrapper[4957]: I0123 10:55:25.899884 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-f6857c485-zm57k" Jan 23 10:55:25 crc kubenswrapper[4957]: I0123 10:55:25.902643 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-pq4t4" event={"ID":"564e2b7f-db59-4c5e-bb9d-eaefadf2e1a8","Type":"ContainerStarted","Data":"28be68d35637eb2aa8b22156235862d7e6d0ad730ae9d8cf32dbc9b8c094e22a"} Jan 23 10:55:25 crc kubenswrapper[4957]: I0123 10:55:25.904255 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-pq4t4" Jan 23 10:55:25 crc kubenswrapper[4957]: I0123 10:55:25.905626 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-f6857c485-zm57k" Jan 23 10:55:25 crc kubenswrapper[4957]: I0123 10:55:25.916535 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-pq4t4" Jan 23 10:55:25 crc kubenswrapper[4957]: I0123 10:55:25.924079 4957 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-l7xvl" Jan 23 10:55:25 crc kubenswrapper[4957]: I0123 10:55:25.924105 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l7xvl" event={"ID":"4210122d-91b9-4890-a8d0-23e71c42d121","Type":"ContainerDied","Data":"67e6131a830a5fad0d0cb1fb6f718d2306bc46a15bb47312ef6d13759ceb1548"} Jan 23 10:55:25 crc kubenswrapper[4957]: I0123 10:55:25.925348 4957 scope.go:117] "RemoveContainer" containerID="52369e67449d0e451f9f3a2943e0cce3d516df5029f90a17c23841d4d9715b85" Jan 23 10:55:26 crc kubenswrapper[4957]: I0123 10:55:26.102401 4957 scope.go:117] "RemoveContainer" containerID="bd860e2065ccae8389202d5b4fbc4f16bb131b9680fbe7cb3fcdd9c3cb491b50" Jan 23 10:55:26 crc kubenswrapper[4957]: I0123 10:55:26.125815 4957 scope.go:117] "RemoveContainer" containerID="c1ffc150b76e56e77cbde75f64b521bc42e86e8378da5423d3a018421b520c69" Jan 23 10:55:26 crc kubenswrapper[4957]: I0123 10:55:26.128701 4957 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-pq4t4" podStartSLOduration=3.128674362 podStartE2EDuration="3.128674362s" podCreationTimestamp="2026-01-23 10:55:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-23 10:55:26.12297207 +0000 UTC m=+235.660224757" watchObservedRunningTime="2026-01-23 10:55:26.128674362 +0000 UTC m=+235.665927049" Jan 23 10:55:26 crc kubenswrapper[4957]: I0123 10:55:26.130435 4957 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-f6857c485-zm57k" podStartSLOduration=7.130424819 podStartE2EDuration="7.130424819s" podCreationTimestamp="2026-01-23 10:55:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-23 10:55:26.102456101 +0000 UTC m=+235.639708798" watchObservedRunningTime="2026-01-23 10:55:26.130424819 +0000 UTC m=+235.667677506" Jan 23 10:55:26 crc kubenswrapper[4957]: I0123 10:55:26.143361 4957 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2c452"] Jan 23 10:55:26 crc kubenswrapper[4957]: I0123 10:55:26.152816 4957 scope.go:117] "RemoveContainer" containerID="cee261db6d544f87c37e4be56a806fcdc0605adb7511695e50fef22ad660c6af" Jan 23 10:55:26 crc kubenswrapper[4957]: I0123 10:55:26.155575 4957 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-2c452"] Jan 23 10:55:26 crc kubenswrapper[4957]: I0123 10:55:26.172778 4957 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-l7xvl"] Jan 23 10:55:26 crc kubenswrapper[4957]: I0123 10:55:26.177739 4957 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-l7xvl"] Jan 23 10:55:26 crc kubenswrapper[4957]: I0123 10:55:26.181108 4957 scope.go:117] "RemoveContainer" containerID="b11dcc9ce8fd4e67119d96225fda2b94e6a9eec65922e063ac911a339fe96636" Jan 23 10:55:26 crc kubenswrapper[4957]: I0123 10:55:26.609493 4957 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-mm4q6"] Jan 23 10:55:26 crc kubenswrapper[4957]: E0123 10:55:26.610025 4957 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0641b36-685a-4625-93cb-a6159de3628e" containerName="extract-utilities" Jan 23 
10:55:26 crc kubenswrapper[4957]: I0123 10:55:26.610040 4957 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0641b36-685a-4625-93cb-a6159de3628e" containerName="extract-utilities" Jan 23 10:55:26 crc kubenswrapper[4957]: E0123 10:55:26.610052 4957 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="783873e0-bce9-4f05-849d-0fe265010d39" containerName="registry-server" Jan 23 10:55:26 crc kubenswrapper[4957]: I0123 10:55:26.610059 4957 state_mem.go:107] "Deleted CPUSet assignment" podUID="783873e0-bce9-4f05-849d-0fe265010d39" containerName="registry-server" Jan 23 10:55:26 crc kubenswrapper[4957]: E0123 10:55:26.610072 4957 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4210122d-91b9-4890-a8d0-23e71c42d121" containerName="registry-server" Jan 23 10:55:26 crc kubenswrapper[4957]: I0123 10:55:26.610080 4957 state_mem.go:107] "Deleted CPUSet assignment" podUID="4210122d-91b9-4890-a8d0-23e71c42d121" containerName="registry-server" Jan 23 10:55:26 crc kubenswrapper[4957]: E0123 10:55:26.610089 4957 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4210122d-91b9-4890-a8d0-23e71c42d121" containerName="extract-utilities" Jan 23 10:55:26 crc kubenswrapper[4957]: I0123 10:55:26.610096 4957 state_mem.go:107] "Deleted CPUSet assignment" podUID="4210122d-91b9-4890-a8d0-23e71c42d121" containerName="extract-utilities" Jan 23 10:55:26 crc kubenswrapper[4957]: E0123 10:55:26.610110 4957 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4210122d-91b9-4890-a8d0-23e71c42d121" containerName="extract-content" Jan 23 10:55:26 crc kubenswrapper[4957]: I0123 10:55:26.610117 4957 state_mem.go:107] "Deleted CPUSet assignment" podUID="4210122d-91b9-4890-a8d0-23e71c42d121" containerName="extract-content" Jan 23 10:55:26 crc kubenswrapper[4957]: E0123 10:55:26.610128 4957 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="783873e0-bce9-4f05-849d-0fe265010d39" containerName="extract-content" Jan 23 10:55:26 crc kubenswrapper[4957]: I0123 10:55:26.610135 4957 state_mem.go:107] "Deleted CPUSet assignment" podUID="783873e0-bce9-4f05-849d-0fe265010d39" containerName="extract-content" Jan 23 10:55:26 crc kubenswrapper[4957]: E0123 10:55:26.610145 4957 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0641b36-685a-4625-93cb-a6159de3628e" containerName="extract-content" Jan 23 10:55:26 crc kubenswrapper[4957]: I0123 10:55:26.610153 4957 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0641b36-685a-4625-93cb-a6159de3628e" containerName="extract-content" Jan 23 10:55:26 crc kubenswrapper[4957]: E0123 10:55:26.610161 4957 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="783873e0-bce9-4f05-849d-0fe265010d39" containerName="extract-utilities" Jan 23 10:55:26 crc kubenswrapper[4957]: I0123 10:55:26.610169 4957 state_mem.go:107] "Deleted CPUSet assignment" podUID="783873e0-bce9-4f05-849d-0fe265010d39" containerName="extract-utilities" Jan 23 10:55:26 crc kubenswrapper[4957]: E0123 10:55:26.610183 4957 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0641b36-685a-4625-93cb-a6159de3628e" containerName="registry-server" Jan 23 10:55:26 crc kubenswrapper[4957]: I0123 10:55:26.610192 4957 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0641b36-685a-4625-93cb-a6159de3628e" containerName="registry-server" Jan 23 10:55:26 crc kubenswrapper[4957]: I0123 10:55:26.610316 4957 memory_manager.go:354] "RemoveStaleState removing state" podUID="4210122d-91b9-4890-a8d0-23e71c42d121" 
containerName="registry-server" Jan 23 10:55:26 crc kubenswrapper[4957]: I0123 10:55:26.610330 4957 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0641b36-685a-4625-93cb-a6159de3628e" containerName="registry-server" Jan 23 10:55:26 crc kubenswrapper[4957]: I0123 10:55:26.610340 4957 memory_manager.go:354] "RemoveStaleState removing state" podUID="783873e0-bce9-4f05-849d-0fe265010d39" containerName="registry-server" Jan 23 10:55:26 crc kubenswrapper[4957]: I0123 10:55:26.611232 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mm4q6" Jan 23 10:55:26 crc kubenswrapper[4957]: I0123 10:55:26.613060 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Jan 23 10:55:26 crc kubenswrapper[4957]: I0123 10:55:26.623997 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mm4q6"] Jan 23 10:55:26 crc kubenswrapper[4957]: I0123 10:55:26.713224 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0aee337e-503a-46b7-8c0b-4e69a7618a9b-catalog-content\") pod \"redhat-marketplace-mm4q6\" (UID: \"0aee337e-503a-46b7-8c0b-4e69a7618a9b\") " pod="openshift-marketplace/redhat-marketplace-mm4q6" Jan 23 10:55:26 crc kubenswrapper[4957]: I0123 10:55:26.713384 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0aee337e-503a-46b7-8c0b-4e69a7618a9b-utilities\") pod \"redhat-marketplace-mm4q6\" (UID: \"0aee337e-503a-46b7-8c0b-4e69a7618a9b\") " pod="openshift-marketplace/redhat-marketplace-mm4q6" Jan 23 10:55:26 crc kubenswrapper[4957]: I0123 10:55:26.713442 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rc7x\" (UniqueName: \"kubernetes.io/projected/0aee337e-503a-46b7-8c0b-4e69a7618a9b-kube-api-access-2rc7x\") pod \"redhat-marketplace-mm4q6\" (UID: \"0aee337e-503a-46b7-8c0b-4e69a7618a9b\") " pod="openshift-marketplace/redhat-marketplace-mm4q6" Jan 23 10:55:26 crc kubenswrapper[4957]: I0123 10:55:26.778441 4957 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4210122d-91b9-4890-a8d0-23e71c42d121" path="/var/lib/kubelet/pods/4210122d-91b9-4890-a8d0-23e71c42d121/volumes" Jan 23 10:55:26 crc kubenswrapper[4957]: I0123 10:55:26.779236 4957 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="783873e0-bce9-4f05-849d-0fe265010d39" path="/var/lib/kubelet/pods/783873e0-bce9-4f05-849d-0fe265010d39/volumes" Jan 23 10:55:26 crc kubenswrapper[4957]: I0123 10:55:26.779907 4957 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0641b36-685a-4625-93cb-a6159de3628e" path="/var/lib/kubelet/pods/c0641b36-685a-4625-93cb-a6159de3628e/volumes" Jan 23 10:55:26 crc kubenswrapper[4957]: I0123 10:55:26.780633 4957 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c951bf2d-f3fe-4a75-8e95-040c46cb1f01" path="/var/lib/kubelet/pods/c951bf2d-f3fe-4a75-8e95-040c46cb1f01/volumes" Jan 23 10:55:26 crc kubenswrapper[4957]: I0123 10:55:26.781134 4957 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e16b20f4-5d6a-4cf6-878a-3cf03afe72bb" path="/var/lib/kubelet/pods/e16b20f4-5d6a-4cf6-878a-3cf03afe72bb/volumes" Jan 23 10:55:26 crc kubenswrapper[4957]: I0123 10:55:26.781820 4957 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd5862ba-aabd-4c6f-b546-3f22d40592b8" path="/var/lib/kubelet/pods/fd5862ba-aabd-4c6f-b546-3f22d40592b8/volumes" Jan 23 10:55:26 crc kubenswrapper[4957]: I0123 10:55:26.815055 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0aee337e-503a-46b7-8c0b-4e69a7618a9b-utilities\") pod \"redhat-marketplace-mm4q6\" (UID: \"0aee337e-503a-46b7-8c0b-4e69a7618a9b\") " pod="openshift-marketplace/redhat-marketplace-mm4q6" Jan 23 10:55:26 crc kubenswrapper[4957]: I0123 10:55:26.815141 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rc7x\" (UniqueName: \"kubernetes.io/projected/0aee337e-503a-46b7-8c0b-4e69a7618a9b-kube-api-access-2rc7x\") pod \"redhat-marketplace-mm4q6\" (UID: \"0aee337e-503a-46b7-8c0b-4e69a7618a9b\") " pod="openshift-marketplace/redhat-marketplace-mm4q6" Jan 23 10:55:26 crc kubenswrapper[4957]: I0123 10:55:26.815218 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0aee337e-503a-46b7-8c0b-4e69a7618a9b-catalog-content\") pod \"redhat-marketplace-mm4q6\" (UID: \"0aee337e-503a-46b7-8c0b-4e69a7618a9b\") " pod="openshift-marketplace/redhat-marketplace-mm4q6" Jan 23 10:55:26 crc kubenswrapper[4957]: I0123 10:55:26.815768 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0aee337e-503a-46b7-8c0b-4e69a7618a9b-catalog-content\") pod \"redhat-marketplace-mm4q6\" (UID: \"0aee337e-503a-46b7-8c0b-4e69a7618a9b\") " pod="openshift-marketplace/redhat-marketplace-mm4q6" Jan 23 10:55:26 crc kubenswrapper[4957]: I0123 10:55:26.816031 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0aee337e-503a-46b7-8c0b-4e69a7618a9b-utilities\") pod \"redhat-marketplace-mm4q6\" (UID: \"0aee337e-503a-46b7-8c0b-4e69a7618a9b\") " pod="openshift-marketplace/redhat-marketplace-mm4q6" Jan 23 10:55:26 crc kubenswrapper[4957]: I0123 10:55:26.843112 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rc7x\" (UniqueName: \"kubernetes.io/projected/0aee337e-503a-46b7-8c0b-4e69a7618a9b-kube-api-access-2rc7x\") pod \"redhat-marketplace-mm4q6\" (UID: \"0aee337e-503a-46b7-8c0b-4e69a7618a9b\") " pod="openshift-marketplace/redhat-marketplace-mm4q6" Jan 23 10:55:26 crc kubenswrapper[4957]: I0123 10:55:26.930259 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mm4q6" Jan 23 10:55:27 crc kubenswrapper[4957]: I0123 10:55:27.210818 4957 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-bsgcz"] Jan 23 10:55:27 crc kubenswrapper[4957]: I0123 10:55:27.211800 4957 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bsgcz" Jan 23 10:55:27 crc kubenswrapper[4957]: I0123 10:55:27.215399 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Jan 23 10:55:27 crc kubenswrapper[4957]: I0123 10:55:27.230080 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bsgcz"] Jan 23 10:55:27 crc kubenswrapper[4957]: I0123 10:55:27.424347 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msrfj\" (UniqueName: \"kubernetes.io/projected/cd776179-759c-4fd9-987a-445a1d516d4c-kube-api-access-msrfj\") pod \"certified-operators-bsgcz\" (UID: \"cd776179-759c-4fd9-987a-445a1d516d4c\") " pod="openshift-marketplace/certified-operators-bsgcz" Jan 23 10:55:27 crc kubenswrapper[4957]: I0123 10:55:27.424512 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd776179-759c-4fd9-987a-445a1d516d4c-catalog-content\") pod \"certified-operators-bsgcz\" (UID: \"cd776179-759c-4fd9-987a-445a1d516d4c\") " pod="openshift-marketplace/certified-operators-bsgcz" Jan 23 10:55:27 crc kubenswrapper[4957]: I0123 10:55:27.424649 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd776179-759c-4fd9-987a-445a1d516d4c-utilities\") pod \"certified-operators-bsgcz\" (UID: \"cd776179-759c-4fd9-987a-445a1d516d4c\") " pod="openshift-marketplace/certified-operators-bsgcz" Jan 23 10:55:27 crc kubenswrapper[4957]: I0123 10:55:27.526236 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd776179-759c-4fd9-987a-445a1d516d4c-catalog-content\") pod \"certified-operators-bsgcz\" (UID: \"cd776179-759c-4fd9-987a-445a1d516d4c\") " pod="openshift-marketplace/certified-operators-bsgcz" Jan 23 10:55:27 crc kubenswrapper[4957]: I0123 10:55:27.526524 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd776179-759c-4fd9-987a-445a1d516d4c-utilities\") pod \"certified-operators-bsgcz\" (UID: \"cd776179-759c-4fd9-987a-445a1d516d4c\") " pod="openshift-marketplace/certified-operators-bsgcz" Jan 23 10:55:27 crc kubenswrapper[4957]: I0123 10:55:27.526737 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-msrfj\" (UniqueName: \"kubernetes.io/projected/cd776179-759c-4fd9-987a-445a1d516d4c-kube-api-access-msrfj\") pod \"certified-operators-bsgcz\" (UID: \"cd776179-759c-4fd9-987a-445a1d516d4c\") " pod="openshift-marketplace/certified-operators-bsgcz" Jan 23 10:55:27 crc kubenswrapper[4957]: I0123 10:55:27.526867 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd776179-759c-4fd9-987a-445a1d516d4c-catalog-content\") pod \"certified-operators-bsgcz\" (UID: \"cd776179-759c-4fd9-987a-445a1d516d4c\") " pod="openshift-marketplace/certified-operators-bsgcz" Jan 23 10:55:27 crc kubenswrapper[4957]: I0123 10:55:27.527179 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd776179-759c-4fd9-987a-445a1d516d4c-utilities\") pod \"certified-operators-bsgcz\" (UID: 
\"cd776179-759c-4fd9-987a-445a1d516d4c\") " pod="openshift-marketplace/certified-operators-bsgcz" Jan 23 10:55:27 crc kubenswrapper[4957]: I0123 10:55:27.543638 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-msrfj\" (UniqueName: \"kubernetes.io/projected/cd776179-759c-4fd9-987a-445a1d516d4c-kube-api-access-msrfj\") pod \"certified-operators-bsgcz\" (UID: \"cd776179-759c-4fd9-987a-445a1d516d4c\") " pod="openshift-marketplace/certified-operators-bsgcz" Jan 23 10:55:27 crc kubenswrapper[4957]: I0123 10:55:27.763806 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bsgcz" Jan 23 10:55:27 crc kubenswrapper[4957]: I0123 10:55:27.767903 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mm4q6"] Jan 23 10:55:27 crc kubenswrapper[4957]: W0123 10:55:27.782070 4957 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0aee337e_503a_46b7_8c0b_4e69a7618a9b.slice/crio-dd19865ae7c9b539b0d686bed3c5f5d0d2b4571c1f1c28001cd9d6754b2d1345 WatchSource:0}: Error finding container dd19865ae7c9b539b0d686bed3c5f5d0d2b4571c1f1c28001cd9d6754b2d1345: Status 404 returned error can't find the container with id dd19865ae7c9b539b0d686bed3c5f5d0d2b4571c1f1c28001cd9d6754b2d1345 Jan 23 10:55:27 crc kubenswrapper[4957]: I0123 10:55:27.943414 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mm4q6" event={"ID":"0aee337e-503a-46b7-8c0b-4e69a7618a9b","Type":"ContainerStarted","Data":"dd19865ae7c9b539b0d686bed3c5f5d0d2b4571c1f1c28001cd9d6754b2d1345"} Jan 23 10:55:28 crc kubenswrapper[4957]: I0123 10:55:28.193485 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bsgcz"] Jan 23 10:55:28 crc kubenswrapper[4957]: W0123 10:55:28.205175 4957 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd776179_759c_4fd9_987a_445a1d516d4c.slice/crio-56859f235a183d5fd6a53bc9c96910fcf0596eaf1c0bce4bce58750d8676f6c7 WatchSource:0}: Error finding container 56859f235a183d5fd6a53bc9c96910fcf0596eaf1c0bce4bce58750d8676f6c7: Status 404 returned error can't find the container with id 56859f235a183d5fd6a53bc9c96910fcf0596eaf1c0bce4bce58750d8676f6c7 Jan 23 10:55:28 crc kubenswrapper[4957]: I0123 10:55:28.950203 4957 generic.go:334] "Generic (PLEG): container finished" podID="0aee337e-503a-46b7-8c0b-4e69a7618a9b" containerID="a0d3b1e74a75f86e28a5b83e352027bdd754fa3f854bf0550477ddb43dc65801" exitCode=0 Jan 23 10:55:28 crc kubenswrapper[4957]: I0123 10:55:28.950294 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mm4q6" event={"ID":"0aee337e-503a-46b7-8c0b-4e69a7618a9b","Type":"ContainerDied","Data":"a0d3b1e74a75f86e28a5b83e352027bdd754fa3f854bf0550477ddb43dc65801"} Jan 23 10:55:28 crc kubenswrapper[4957]: I0123 10:55:28.954051 4957 generic.go:334] "Generic (PLEG): container finished" podID="cd776179-759c-4fd9-987a-445a1d516d4c" containerID="441657049a287a01b440654ee775ca480d6518f7ec52204faaebbb9d9e6346aa" exitCode=0 Jan 23 10:55:28 crc kubenswrapper[4957]: I0123 10:55:28.954093 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bsgcz" 
event={"ID":"cd776179-759c-4fd9-987a-445a1d516d4c","Type":"ContainerDied","Data":"441657049a287a01b440654ee775ca480d6518f7ec52204faaebbb9d9e6346aa"} Jan 23 10:55:28 crc kubenswrapper[4957]: I0123 10:55:28.954129 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bsgcz" event={"ID":"cd776179-759c-4fd9-987a-445a1d516d4c","Type":"ContainerStarted","Data":"56859f235a183d5fd6a53bc9c96910fcf0596eaf1c0bce4bce58750d8676f6c7"} Jan 23 10:55:29 crc kubenswrapper[4957]: I0123 10:55:29.009666 4957 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xhhxg"] Jan 23 10:55:29 crc kubenswrapper[4957]: I0123 10:55:29.011340 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xhhxg" Jan 23 10:55:29 crc kubenswrapper[4957]: I0123 10:55:29.015870 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Jan 23 10:55:29 crc kubenswrapper[4957]: I0123 10:55:29.019554 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xhhxg"] Jan 23 10:55:29 crc kubenswrapper[4957]: I0123 10:55:29.147209 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xx87f\" (UniqueName: \"kubernetes.io/projected/9072b86d-252b-4804-ab47-be737d2a88ee-kube-api-access-xx87f\") pod \"redhat-operators-xhhxg\" (UID: \"9072b86d-252b-4804-ab47-be737d2a88ee\") " pod="openshift-marketplace/redhat-operators-xhhxg" Jan 23 10:55:29 crc kubenswrapper[4957]: I0123 10:55:29.147314 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9072b86d-252b-4804-ab47-be737d2a88ee-utilities\") pod \"redhat-operators-xhhxg\" (UID: \"9072b86d-252b-4804-ab47-be737d2a88ee\") " pod="openshift-marketplace/redhat-operators-xhhxg" Jan 23 10:55:29 crc kubenswrapper[4957]: I0123 10:55:29.147352 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9072b86d-252b-4804-ab47-be737d2a88ee-catalog-content\") pod \"redhat-operators-xhhxg\" (UID: \"9072b86d-252b-4804-ab47-be737d2a88ee\") " pod="openshift-marketplace/redhat-operators-xhhxg" Jan 23 10:55:29 crc kubenswrapper[4957]: I0123 10:55:29.248855 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9072b86d-252b-4804-ab47-be737d2a88ee-utilities\") pod \"redhat-operators-xhhxg\" (UID: \"9072b86d-252b-4804-ab47-be737d2a88ee\") " pod="openshift-marketplace/redhat-operators-xhhxg" Jan 23 10:55:29 crc kubenswrapper[4957]: I0123 10:55:29.248941 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9072b86d-252b-4804-ab47-be737d2a88ee-catalog-content\") pod \"redhat-operators-xhhxg\" (UID: \"9072b86d-252b-4804-ab47-be737d2a88ee\") " pod="openshift-marketplace/redhat-operators-xhhxg" Jan 23 10:55:29 crc kubenswrapper[4957]: I0123 10:55:29.248994 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xx87f\" (UniqueName: \"kubernetes.io/projected/9072b86d-252b-4804-ab47-be737d2a88ee-kube-api-access-xx87f\") pod \"redhat-operators-xhhxg\" (UID: \"9072b86d-252b-4804-ab47-be737d2a88ee\") " 
pod="openshift-marketplace/redhat-operators-xhhxg" Jan 23 10:55:29 crc kubenswrapper[4957]: I0123 10:55:29.249712 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9072b86d-252b-4804-ab47-be737d2a88ee-utilities\") pod \"redhat-operators-xhhxg\" (UID: \"9072b86d-252b-4804-ab47-be737d2a88ee\") " pod="openshift-marketplace/redhat-operators-xhhxg" Jan 23 10:55:29 crc kubenswrapper[4957]: I0123 10:55:29.249922 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9072b86d-252b-4804-ab47-be737d2a88ee-catalog-content\") pod \"redhat-operators-xhhxg\" (UID: \"9072b86d-252b-4804-ab47-be737d2a88ee\") " pod="openshift-marketplace/redhat-operators-xhhxg" Jan 23 10:55:29 crc kubenswrapper[4957]: I0123 10:55:29.274100 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xx87f\" (UniqueName: \"kubernetes.io/projected/9072b86d-252b-4804-ab47-be737d2a88ee-kube-api-access-xx87f\") pod \"redhat-operators-xhhxg\" (UID: \"9072b86d-252b-4804-ab47-be737d2a88ee\") " pod="openshift-marketplace/redhat-operators-xhhxg" Jan 23 10:55:29 crc kubenswrapper[4957]: I0123 10:55:29.336084 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xhhxg" Jan 23 10:55:29 crc kubenswrapper[4957]: I0123 10:55:29.605448 4957 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-2p4vh"] Jan 23 10:55:29 crc kubenswrapper[4957]: I0123 10:55:29.607048 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2p4vh" Jan 23 10:55:29 crc kubenswrapper[4957]: I0123 10:55:29.610704 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Jan 23 10:55:29 crc kubenswrapper[4957]: I0123 10:55:29.620721 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2p4vh"] Jan 23 10:55:29 crc kubenswrapper[4957]: I0123 10:55:29.719244 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xhhxg"] Jan 23 10:55:29 crc kubenswrapper[4957]: W0123 10:55:29.723366 4957 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9072b86d_252b_4804_ab47_be737d2a88ee.slice/crio-4cdfa389e7609fdbb82d93bba20e3af01d599a621062b857f8f04da144dd88cb WatchSource:0}: Error finding container 4cdfa389e7609fdbb82d93bba20e3af01d599a621062b857f8f04da144dd88cb: Status 404 returned error can't find the container with id 4cdfa389e7609fdbb82d93bba20e3af01d599a621062b857f8f04da144dd88cb Jan 23 10:55:29 crc kubenswrapper[4957]: I0123 10:55:29.755942 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9dc559e-59e7-459c-9a4a-fb361bffad34-catalog-content\") pod \"community-operators-2p4vh\" (UID: \"d9dc559e-59e7-459c-9a4a-fb361bffad34\") " pod="openshift-marketplace/community-operators-2p4vh" Jan 23 10:55:29 crc kubenswrapper[4957]: I0123 10:55:29.756022 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lknls\" (UniqueName: \"kubernetes.io/projected/d9dc559e-59e7-459c-9a4a-fb361bffad34-kube-api-access-lknls\") pod 
\"community-operators-2p4vh\" (UID: \"d9dc559e-59e7-459c-9a4a-fb361bffad34\") " pod="openshift-marketplace/community-operators-2p4vh" Jan 23 10:55:29 crc kubenswrapper[4957]: I0123 10:55:29.756088 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9dc559e-59e7-459c-9a4a-fb361bffad34-utilities\") pod \"community-operators-2p4vh\" (UID: \"d9dc559e-59e7-459c-9a4a-fb361bffad34\") " pod="openshift-marketplace/community-operators-2p4vh" Jan 23 10:55:29 crc kubenswrapper[4957]: I0123 10:55:29.857362 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9dc559e-59e7-459c-9a4a-fb361bffad34-catalog-content\") pod \"community-operators-2p4vh\" (UID: \"d9dc559e-59e7-459c-9a4a-fb361bffad34\") " pod="openshift-marketplace/community-operators-2p4vh" Jan 23 10:55:29 crc kubenswrapper[4957]: I0123 10:55:29.857425 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lknls\" (UniqueName: \"kubernetes.io/projected/d9dc559e-59e7-459c-9a4a-fb361bffad34-kube-api-access-lknls\") pod \"community-operators-2p4vh\" (UID: \"d9dc559e-59e7-459c-9a4a-fb361bffad34\") " pod="openshift-marketplace/community-operators-2p4vh" Jan 23 10:55:29 crc kubenswrapper[4957]: I0123 10:55:29.857459 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9dc559e-59e7-459c-9a4a-fb361bffad34-utilities\") pod \"community-operators-2p4vh\" (UID: \"d9dc559e-59e7-459c-9a4a-fb361bffad34\") " pod="openshift-marketplace/community-operators-2p4vh" Jan 23 10:55:29 crc kubenswrapper[4957]: I0123 10:55:29.857915 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9dc559e-59e7-459c-9a4a-fb361bffad34-utilities\") pod \"community-operators-2p4vh\" (UID: \"d9dc559e-59e7-459c-9a4a-fb361bffad34\") " pod="openshift-marketplace/community-operators-2p4vh" Jan 23 10:55:29 crc kubenswrapper[4957]: I0123 10:55:29.858131 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9dc559e-59e7-459c-9a4a-fb361bffad34-catalog-content\") pod \"community-operators-2p4vh\" (UID: \"d9dc559e-59e7-459c-9a4a-fb361bffad34\") " pod="openshift-marketplace/community-operators-2p4vh" Jan 23 10:55:29 crc kubenswrapper[4957]: I0123 10:55:29.881770 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lknls\" (UniqueName: \"kubernetes.io/projected/d9dc559e-59e7-459c-9a4a-fb361bffad34-kube-api-access-lknls\") pod \"community-operators-2p4vh\" (UID: \"d9dc559e-59e7-459c-9a4a-fb361bffad34\") " pod="openshift-marketplace/community-operators-2p4vh" Jan 23 10:55:29 crc kubenswrapper[4957]: I0123 10:55:29.940040 4957 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2p4vh" Jan 23 10:55:29 crc kubenswrapper[4957]: I0123 10:55:29.965102 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bsgcz" event={"ID":"cd776179-759c-4fd9-987a-445a1d516d4c","Type":"ContainerStarted","Data":"1655c1fc89d14c5776bdfda6e411831a79af17e90d8deadc25120637fec6a1a1"} Jan 23 10:55:29 crc kubenswrapper[4957]: I0123 10:55:29.978336 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mm4q6" event={"ID":"0aee337e-503a-46b7-8c0b-4e69a7618a9b","Type":"ContainerStarted","Data":"41b1611656ce6cad9daec18f7445e8c608a4bab0ab3fc1faed6e8b3e5f1efa5b"} Jan 23 10:55:29 crc kubenswrapper[4957]: I0123 10:55:29.981999 4957 generic.go:334] "Generic (PLEG): container finished" podID="9072b86d-252b-4804-ab47-be737d2a88ee" containerID="b5b39e6f4d8b36d12b1b436c4374abd3913855b70557b93d07a047468c6ca6ea" exitCode=0 Jan 23 10:55:29 crc kubenswrapper[4957]: I0123 10:55:29.982056 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xhhxg" event={"ID":"9072b86d-252b-4804-ab47-be737d2a88ee","Type":"ContainerDied","Data":"b5b39e6f4d8b36d12b1b436c4374abd3913855b70557b93d07a047468c6ca6ea"} Jan 23 10:55:29 crc kubenswrapper[4957]: I0123 10:55:29.982085 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xhhxg" event={"ID":"9072b86d-252b-4804-ab47-be737d2a88ee","Type":"ContainerStarted","Data":"4cdfa389e7609fdbb82d93bba20e3af01d599a621062b857f8f04da144dd88cb"} Jan 23 10:55:30 crc kubenswrapper[4957]: I0123 10:55:30.357116 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2p4vh"] Jan 23 10:55:30 crc kubenswrapper[4957]: W0123 10:55:30.363397 4957 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9dc559e_59e7_459c_9a4a_fb361bffad34.slice/crio-2c2e442af99630edafbbb495356854a2259d54ce8eab0dfdb964e2914f4925af WatchSource:0}: Error finding container 2c2e442af99630edafbbb495356854a2259d54ce8eab0dfdb964e2914f4925af: Status 404 returned error can't find the container with id 2c2e442af99630edafbbb495356854a2259d54ce8eab0dfdb964e2914f4925af Jan 23 10:55:30 crc kubenswrapper[4957]: I0123 10:55:30.988748 4957 generic.go:334] "Generic (PLEG): container finished" podID="d9dc559e-59e7-459c-9a4a-fb361bffad34" containerID="994952ac9a46198e8622905c2608953bf41896d149db33172701c1ecd37a53d8" exitCode=0 Jan 23 10:55:30 crc kubenswrapper[4957]: I0123 10:55:30.988851 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2p4vh" event={"ID":"d9dc559e-59e7-459c-9a4a-fb361bffad34","Type":"ContainerDied","Data":"994952ac9a46198e8622905c2608953bf41896d149db33172701c1ecd37a53d8"} Jan 23 10:55:30 crc kubenswrapper[4957]: I0123 10:55:30.989355 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2p4vh" event={"ID":"d9dc559e-59e7-459c-9a4a-fb361bffad34","Type":"ContainerStarted","Data":"2c2e442af99630edafbbb495356854a2259d54ce8eab0dfdb964e2914f4925af"} Jan 23 10:55:30 crc kubenswrapper[4957]: I0123 10:55:30.991119 4957 generic.go:334] "Generic (PLEG): container finished" podID="0aee337e-503a-46b7-8c0b-4e69a7618a9b" containerID="41b1611656ce6cad9daec18f7445e8c608a4bab0ab3fc1faed6e8b3e5f1efa5b" exitCode=0 Jan 23 10:55:30 crc kubenswrapper[4957]: I0123 10:55:30.991173 
4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mm4q6" event={"ID":"0aee337e-503a-46b7-8c0b-4e69a7618a9b","Type":"ContainerDied","Data":"41b1611656ce6cad9daec18f7445e8c608a4bab0ab3fc1faed6e8b3e5f1efa5b"} Jan 23 10:55:30 crc kubenswrapper[4957]: I0123 10:55:30.996686 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xhhxg" event={"ID":"9072b86d-252b-4804-ab47-be737d2a88ee","Type":"ContainerStarted","Data":"d37fdb044704aeb4b4f67b572a66c30d98bbce435d2dddd57574b3a9d89ba58e"} Jan 23 10:55:31 crc kubenswrapper[4957]: I0123 10:55:31.000113 4957 generic.go:334] "Generic (PLEG): container finished" podID="cd776179-759c-4fd9-987a-445a1d516d4c" containerID="1655c1fc89d14c5776bdfda6e411831a79af17e90d8deadc25120637fec6a1a1" exitCode=0 Jan 23 10:55:31 crc kubenswrapper[4957]: I0123 10:55:31.000141 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bsgcz" event={"ID":"cd776179-759c-4fd9-987a-445a1d516d4c","Type":"ContainerDied","Data":"1655c1fc89d14c5776bdfda6e411831a79af17e90d8deadc25120637fec6a1a1"} Jan 23 10:55:32 crc kubenswrapper[4957]: I0123 10:55:32.008193 4957 generic.go:334] "Generic (PLEG): container finished" podID="9072b86d-252b-4804-ab47-be737d2a88ee" containerID="d37fdb044704aeb4b4f67b572a66c30d98bbce435d2dddd57574b3a9d89ba58e" exitCode=0 Jan 23 10:55:32 crc kubenswrapper[4957]: I0123 10:55:32.008621 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xhhxg" event={"ID":"9072b86d-252b-4804-ab47-be737d2a88ee","Type":"ContainerDied","Data":"d37fdb044704aeb4b4f67b572a66c30d98bbce435d2dddd57574b3a9d89ba58e"} Jan 23 10:55:32 crc kubenswrapper[4957]: I0123 10:55:32.924385 4957 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 23 10:55:32 crc kubenswrapper[4957]: I0123 10:55:32.926194 4957 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://8cdf71b1a8491d3a4853fde19a5b1af1eb4697cbf07de482e22a52704ba0470f" gracePeriod=15 Jan 23 10:55:32 crc kubenswrapper[4957]: I0123 10:55:32.926485 4957 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://4b6915f908509c8609290327ffc2dccf0e5680dc227979285a7ebaca4643cb7a" gracePeriod=15 Jan 23 10:55:32 crc kubenswrapper[4957]: I0123 10:55:32.926589 4957 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://da77583099215643577c5d064d67ce2cca9d0b74e7ba7c88f3a948a8516fd66c" gracePeriod=15 Jan 23 10:55:32 crc kubenswrapper[4957]: I0123 10:55:32.926740 4957 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://52eea4c3c7c3b8898e64dd0eb05c1883ea1c2fa94e7e606f3ab48bbf5aaee8d3" gracePeriod=15 Jan 23 10:55:32 crc kubenswrapper[4957]: I0123 10:55:32.926911 4957 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://9f405b6b517d30a201b793965bd82536f496d62b89562cefc7e3a9d9f7829633" gracePeriod=15 Jan 23 10:55:32 crc kubenswrapper[4957]: I0123 10:55:32.927213 4957 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 23 10:55:32 crc kubenswrapper[4957]: E0123 10:55:32.927630 4957 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 23 10:55:32 crc kubenswrapper[4957]: I0123 10:55:32.927681 4957 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 23 10:55:32 crc kubenswrapper[4957]: E0123 10:55:32.927696 4957 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Jan 23 10:55:32 crc kubenswrapper[4957]: I0123 10:55:32.927705 4957 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Jan 23 10:55:32 crc kubenswrapper[4957]: E0123 10:55:32.927757 4957 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Jan 23 10:55:32 crc kubenswrapper[4957]: I0123 10:55:32.927768 4957 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Jan 23 10:55:32 crc kubenswrapper[4957]: E0123 10:55:32.927784 4957 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Jan 23 10:55:32 crc kubenswrapper[4957]: I0123 10:55:32.927794 4957 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Jan 23 10:55:32 crc kubenswrapper[4957]: E0123 10:55:32.927807 4957 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Jan 23 10:55:32 crc kubenswrapper[4957]: I0123 10:55:32.927847 4957 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Jan 23 10:55:32 crc kubenswrapper[4957]: E0123 10:55:32.927856 4957 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Jan 23 10:55:32 crc kubenswrapper[4957]: I0123 10:55:32.927864 4957 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Jan 23 10:55:32 crc kubenswrapper[4957]: I0123 10:55:32.928058 4957 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Jan 23 10:55:32 crc kubenswrapper[4957]: I0123 10:55:32.928208 4957 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Jan 23 10:55:32 crc kubenswrapper[4957]: I0123 10:55:32.928229 4957 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 23 10:55:32 crc kubenswrapper[4957]: I0123 10:55:32.928339 4957 
memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Jan 23 10:55:32 crc kubenswrapper[4957]: I0123 10:55:32.928362 4957 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Jan 23 10:55:32 crc kubenswrapper[4957]: I0123 10:55:32.928442 4957 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 23 10:55:32 crc kubenswrapper[4957]: E0123 10:55:32.928627 4957 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 23 10:55:32 crc kubenswrapper[4957]: I0123 10:55:32.928651 4957 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 23 10:55:32 crc kubenswrapper[4957]: I0123 10:55:32.930160 4957 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Jan 23 10:55:32 crc kubenswrapper[4957]: I0123 10:55:32.930807 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 23 10:55:32 crc kubenswrapper[4957]: I0123 10:55:32.947611 4957 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="f4b27818a5e8e43d0dc095d08835c792" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" Jan 23 10:55:33 crc kubenswrapper[4957]: I0123 10:55:33.021620 4957 generic.go:334] "Generic (PLEG): container finished" podID="d9dc559e-59e7-459c-9a4a-fb361bffad34" containerID="66ecfa88e97295ea3654ed5cc3fdf68b5eabe247986c31fbc5f76c4394e6a332" exitCode=0 Jan 23 10:55:33 crc kubenswrapper[4957]: I0123 10:55:33.021698 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2p4vh" event={"ID":"d9dc559e-59e7-459c-9a4a-fb361bffad34","Type":"ContainerDied","Data":"66ecfa88e97295ea3654ed5cc3fdf68b5eabe247986c31fbc5f76c4394e6a332"} Jan 23 10:55:33 crc kubenswrapper[4957]: I0123 10:55:33.023225 4957 status_manager.go:851] "Failed to get status for pod" podUID="d9dc559e-59e7-459c-9a4a-fb361bffad34" pod="openshift-marketplace/community-operators-2p4vh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-2p4vh\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:33 crc kubenswrapper[4957]: E0123 10:55:33.025541 4957 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/events\": dial tcp 38.102.83.9:6443: connect: connection refused" event="&Event{ObjectMeta:{community-operators-2p4vh.188d56dc59b15340 openshift-marketplace 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-marketplace,Name:community-operators-2p4vh,UID:d9dc559e-59e7-459c-9a4a-fb361bffad34,APIVersion:v1,ResourceVersion:29941,FieldPath:spec.containers{registry-server},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\",Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-23 
10:55:33.024060224 +0000 UTC m=+242.561312911,LastTimestamp:2026-01-23 10:55:33.024060224 +0000 UTC m=+242.561312911,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 23 10:55:33 crc kubenswrapper[4957]: I0123 10:55:33.026514 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mm4q6" event={"ID":"0aee337e-503a-46b7-8c0b-4e69a7618a9b","Type":"ContainerStarted","Data":"472c61ac92d78b14e720ab5a457372e7326356502517d57f2ff997d0204ed20b"} Jan 23 10:55:33 crc kubenswrapper[4957]: I0123 10:55:33.028555 4957 status_manager.go:851] "Failed to get status for pod" podUID="d9dc559e-59e7-459c-9a4a-fb361bffad34" pod="openshift-marketplace/community-operators-2p4vh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-2p4vh\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:33 crc kubenswrapper[4957]: I0123 10:55:33.028859 4957 status_manager.go:851] "Failed to get status for pod" podUID="0aee337e-503a-46b7-8c0b-4e69a7618a9b" pod="openshift-marketplace/redhat-marketplace-mm4q6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mm4q6\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:33 crc kubenswrapper[4957]: I0123 10:55:33.031604 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bsgcz" event={"ID":"cd776179-759c-4fd9-987a-445a1d516d4c","Type":"ContainerStarted","Data":"f43c469996a2d0fa4c2c074e513adc4c94048c055191c43634062eb36ce34ce8"} Jan 23 10:55:33 crc kubenswrapper[4957]: I0123 10:55:33.032265 4957 status_manager.go:851] "Failed to get status for pod" podUID="0aee337e-503a-46b7-8c0b-4e69a7618a9b" pod="openshift-marketplace/redhat-marketplace-mm4q6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mm4q6\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:33 crc kubenswrapper[4957]: I0123 10:55:33.037257 4957 status_manager.go:851] "Failed to get status for pod" podUID="cd776179-759c-4fd9-987a-445a1d516d4c" pod="openshift-marketplace/certified-operators-bsgcz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-bsgcz\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:33 crc kubenswrapper[4957]: I0123 10:55:33.037808 4957 status_manager.go:851] "Failed to get status for pod" podUID="d9dc559e-59e7-459c-9a4a-fb361bffad34" pod="openshift-marketplace/community-operators-2p4vh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-2p4vh\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:33 crc kubenswrapper[4957]: I0123 10:55:33.105178 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 23 10:55:33 crc kubenswrapper[4957]: I0123 10:55:33.105720 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 23 10:55:33 crc kubenswrapper[4957]: I0123 10:55:33.105820 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 23 10:55:33 crc kubenswrapper[4957]: I0123 10:55:33.105955 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 23 10:55:33 crc kubenswrapper[4957]: I0123 10:55:33.106044 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 23 10:55:33 crc kubenswrapper[4957]: I0123 10:55:33.106145 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 23 10:55:33 crc kubenswrapper[4957]: I0123 10:55:33.106263 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 23 10:55:33 crc kubenswrapper[4957]: I0123 10:55:33.106430 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 23 10:55:33 crc kubenswrapper[4957]: E0123 10:55:33.170934 4957 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/system.slice/NetworkManager-dispatcher.service\": RecentStats: unable to find data in memory cache]" Jan 23 10:55:33 crc kubenswrapper[4957]: I0123 10:55:33.207864 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 23 10:55:33 crc kubenswrapper[4957]: I0123 10:55:33.207914 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 23 10:55:33 crc kubenswrapper[4957]: I0123 10:55:33.207932 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 23 10:55:33 crc kubenswrapper[4957]: I0123 10:55:33.207979 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 23 10:55:33 crc kubenswrapper[4957]: I0123 10:55:33.207996 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 23 10:55:33 crc kubenswrapper[4957]: I0123 10:55:33.208012 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 23 10:55:33 crc kubenswrapper[4957]: I0123 10:55:33.208016 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 23 10:55:33 crc kubenswrapper[4957]: I0123 10:55:33.208048 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 23 10:55:33 crc kubenswrapper[4957]: I0123 10:55:33.208083 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 23 10:55:33 crc kubenswrapper[4957]: I0123 10:55:33.208075 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 23 10:55:33 crc kubenswrapper[4957]: I0123 10:55:33.208088 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod 
\"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 23 10:55:33 crc kubenswrapper[4957]: I0123 10:55:33.208143 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 23 10:55:33 crc kubenswrapper[4957]: I0123 10:55:33.208165 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 23 10:55:33 crc kubenswrapper[4957]: I0123 10:55:33.208196 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 23 10:55:33 crc kubenswrapper[4957]: I0123 10:55:33.208429 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 23 10:55:33 crc kubenswrapper[4957]: I0123 10:55:33.208809 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 23 10:55:34 crc kubenswrapper[4957]: I0123 10:55:34.040517 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Jan 23 10:55:34 crc kubenswrapper[4957]: I0123 10:55:34.042119 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 23 10:55:34 crc kubenswrapper[4957]: I0123 10:55:34.043043 4957 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="4b6915f908509c8609290327ffc2dccf0e5680dc227979285a7ebaca4643cb7a" exitCode=0 Jan 23 10:55:34 crc kubenswrapper[4957]: I0123 10:55:34.043067 4957 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="da77583099215643577c5d064d67ce2cca9d0b74e7ba7c88f3a948a8516fd66c" exitCode=0 Jan 23 10:55:34 crc kubenswrapper[4957]: I0123 10:55:34.043078 4957 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="52eea4c3c7c3b8898e64dd0eb05c1883ea1c2fa94e7e606f3ab48bbf5aaee8d3" exitCode=0 Jan 23 10:55:34 crc kubenswrapper[4957]: I0123 10:55:34.043088 4957 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="9f405b6b517d30a201b793965bd82536f496d62b89562cefc7e3a9d9f7829633" exitCode=2 Jan 23 10:55:34 crc 
kubenswrapper[4957]: I0123 10:55:34.043161 4957 scope.go:117] "RemoveContainer" containerID="5e837e02e63dbe59e7920302c0fb0b5c9165e96ebb684adadb02bacd61633214" Jan 23 10:55:34 crc kubenswrapper[4957]: I0123 10:55:34.046102 4957 generic.go:334] "Generic (PLEG): container finished" podID="d86df6f9-d655-44a7-a92e-8c7bf90d92af" containerID="ffab0ed388b1fb0a4fd8da39cc0262d4ad82fac0ffd44d6e038643c06ff87b7f" exitCode=0 Jan 23 10:55:34 crc kubenswrapper[4957]: I0123 10:55:34.046178 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"d86df6f9-d655-44a7-a92e-8c7bf90d92af","Type":"ContainerDied","Data":"ffab0ed388b1fb0a4fd8da39cc0262d4ad82fac0ffd44d6e038643c06ff87b7f"} Jan 23 10:55:34 crc kubenswrapper[4957]: I0123 10:55:34.046815 4957 status_manager.go:851] "Failed to get status for pod" podUID="d86df6f9-d655-44a7-a92e-8c7bf90d92af" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:34 crc kubenswrapper[4957]: I0123 10:55:34.047047 4957 status_manager.go:851] "Failed to get status for pod" podUID="0aee337e-503a-46b7-8c0b-4e69a7618a9b" pod="openshift-marketplace/redhat-marketplace-mm4q6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mm4q6\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:34 crc kubenswrapper[4957]: I0123 10:55:34.047219 4957 status_manager.go:851] "Failed to get status for pod" podUID="cd776179-759c-4fd9-987a-445a1d516d4c" pod="openshift-marketplace/certified-operators-bsgcz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-bsgcz\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:34 crc kubenswrapper[4957]: I0123 10:55:34.047581 4957 status_manager.go:851] "Failed to get status for pod" podUID="d9dc559e-59e7-459c-9a4a-fb361bffad34" pod="openshift-marketplace/community-operators-2p4vh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-2p4vh\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:34 crc kubenswrapper[4957]: I0123 10:55:34.050092 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2p4vh" event={"ID":"d9dc559e-59e7-459c-9a4a-fb361bffad34","Type":"ContainerStarted","Data":"8539f6b13c0486c6540956efa51c9cea209ee958ce2539dc31811ef3910f5861"} Jan 23 10:55:34 crc kubenswrapper[4957]: I0123 10:55:34.050982 4957 status_manager.go:851] "Failed to get status for pod" podUID="d86df6f9-d655-44a7-a92e-8c7bf90d92af" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:34 crc kubenswrapper[4957]: I0123 10:55:34.051392 4957 status_manager.go:851] "Failed to get status for pod" podUID="0aee337e-503a-46b7-8c0b-4e69a7618a9b" pod="openshift-marketplace/redhat-marketplace-mm4q6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mm4q6\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:34 crc kubenswrapper[4957]: I0123 10:55:34.051637 4957 status_manager.go:851] "Failed to get status for pod" 
podUID="cd776179-759c-4fd9-987a-445a1d516d4c" pod="openshift-marketplace/certified-operators-bsgcz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-bsgcz\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:34 crc kubenswrapper[4957]: I0123 10:55:34.051946 4957 status_manager.go:851] "Failed to get status for pod" podUID="d9dc559e-59e7-459c-9a4a-fb361bffad34" pod="openshift-marketplace/community-operators-2p4vh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-2p4vh\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:34 crc kubenswrapper[4957]: I0123 10:55:34.052899 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xhhxg" event={"ID":"9072b86d-252b-4804-ab47-be737d2a88ee","Type":"ContainerStarted","Data":"06ffc2d62cf277fd035c8e2ec137f24a1a8a4449cbfc3314fea7faeff3fb5b6e"} Jan 23 10:55:34 crc kubenswrapper[4957]: I0123 10:55:34.053705 4957 status_manager.go:851] "Failed to get status for pod" podUID="0aee337e-503a-46b7-8c0b-4e69a7618a9b" pod="openshift-marketplace/redhat-marketplace-mm4q6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mm4q6\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:34 crc kubenswrapper[4957]: I0123 10:55:34.053966 4957 status_manager.go:851] "Failed to get status for pod" podUID="cd776179-759c-4fd9-987a-445a1d516d4c" pod="openshift-marketplace/certified-operators-bsgcz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-bsgcz\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:34 crc kubenswrapper[4957]: I0123 10:55:34.054223 4957 status_manager.go:851] "Failed to get status for pod" podUID="d9dc559e-59e7-459c-9a4a-fb361bffad34" pod="openshift-marketplace/community-operators-2p4vh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-2p4vh\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:34 crc kubenswrapper[4957]: I0123 10:55:34.054527 4957 status_manager.go:851] "Failed to get status for pod" podUID="9072b86d-252b-4804-ab47-be737d2a88ee" pod="openshift-marketplace/redhat-operators-xhhxg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-xhhxg\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:34 crc kubenswrapper[4957]: I0123 10:55:34.054730 4957 status_manager.go:851] "Failed to get status for pod" podUID="d86df6f9-d655-44a7-a92e-8c7bf90d92af" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:34 crc kubenswrapper[4957]: E0123 10:55:34.282024 4957 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/events\": dial tcp 38.102.83.9:6443: connect: connection refused" event="&Event{ObjectMeta:{community-operators-2p4vh.188d56dc59b15340 openshift-marketplace 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-marketplace,Name:community-operators-2p4vh,UID:d9dc559e-59e7-459c-9a4a-fb361bffad34,APIVersion:v1,ResourceVersion:29941,FieldPath:spec.containers{registry-server},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\",Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-23 10:55:33.024060224 +0000 UTC m=+242.561312911,LastTimestamp:2026-01-23 10:55:33.024060224 +0000 UTC m=+242.561312911,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 23 10:55:35 crc kubenswrapper[4957]: I0123 10:55:35.065952 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 23 10:55:35 crc kubenswrapper[4957]: I0123 10:55:35.370443 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-p5lm9" Jan 23 10:55:35 crc kubenswrapper[4957]: I0123 10:55:35.370966 4957 status_manager.go:851] "Failed to get status for pod" podUID="d86df6f9-d655-44a7-a92e-8c7bf90d92af" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:35 crc kubenswrapper[4957]: I0123 10:55:35.371531 4957 status_manager.go:851] "Failed to get status for pod" podUID="0aee337e-503a-46b7-8c0b-4e69a7618a9b" pod="openshift-marketplace/redhat-marketplace-mm4q6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mm4q6\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:35 crc kubenswrapper[4957]: I0123 10:55:35.371984 4957 status_manager.go:851] "Failed to get status for pod" podUID="cd776179-759c-4fd9-987a-445a1d516d4c" pod="openshift-marketplace/certified-operators-bsgcz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-bsgcz\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:35 crc kubenswrapper[4957]: I0123 10:55:35.372267 4957 status_manager.go:851] "Failed to get status for pod" podUID="d9dc559e-59e7-459c-9a4a-fb361bffad34" pod="openshift-marketplace/community-operators-2p4vh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-2p4vh\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:35 crc kubenswrapper[4957]: I0123 10:55:35.372546 4957 status_manager.go:851] "Failed to get status for pod" podUID="9072b86d-252b-4804-ab47-be737d2a88ee" pod="openshift-marketplace/redhat-operators-xhhxg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-xhhxg\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:35 crc kubenswrapper[4957]: I0123 10:55:35.372731 4957 status_manager.go:851] "Failed to get status for pod" podUID="4856b826-759f-4132-bfc5-d80385771e22" pod="openshift-image-registry/image-registry-66df7c8f76-p5lm9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-registry-66df7c8f76-p5lm9\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:35 crc 
kubenswrapper[4957]: E0123 10:55:35.439033 4957 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.9:6443: connect: connection refused" pod="openshift-image-registry/image-registry-66df7c8f76-p5lm9" volumeName="registry-storage" Jan 23 10:55:35 crc kubenswrapper[4957]: E0123 10:55:35.647509 4957 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:35 crc kubenswrapper[4957]: E0123 10:55:35.648043 4957 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:35 crc kubenswrapper[4957]: E0123 10:55:35.648585 4957 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:35 crc kubenswrapper[4957]: E0123 10:55:35.648890 4957 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:35 crc kubenswrapper[4957]: E0123 10:55:35.649147 4957 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:35 crc kubenswrapper[4957]: I0123 10:55:35.649170 4957 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Jan 23 10:55:35 crc kubenswrapper[4957]: E0123 10:55:35.649398 4957 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.9:6443: connect: connection refused" interval="200ms" Jan 23 10:55:35 crc kubenswrapper[4957]: I0123 10:55:35.689648 4957 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 23 10:55:35 crc kubenswrapper[4957]: I0123 10:55:35.690166 4957 status_manager.go:851] "Failed to get status for pod" podUID="d9dc559e-59e7-459c-9a4a-fb361bffad34" pod="openshift-marketplace/community-operators-2p4vh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-2p4vh\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:35 crc kubenswrapper[4957]: I0123 10:55:35.690606 4957 status_manager.go:851] "Failed to get status for pod" podUID="9072b86d-252b-4804-ab47-be737d2a88ee" pod="openshift-marketplace/redhat-operators-xhhxg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-xhhxg\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:35 crc kubenswrapper[4957]: I0123 10:55:35.691092 4957 status_manager.go:851] "Failed to get status for pod" podUID="4856b826-759f-4132-bfc5-d80385771e22" pod="openshift-image-registry/image-registry-66df7c8f76-p5lm9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-registry-66df7c8f76-p5lm9\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:35 crc kubenswrapper[4957]: I0123 10:55:35.691329 4957 status_manager.go:851] "Failed to get status for pod" podUID="d86df6f9-d655-44a7-a92e-8c7bf90d92af" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:35 crc kubenswrapper[4957]: I0123 10:55:35.691582 4957 status_manager.go:851] "Failed to get status for pod" podUID="0aee337e-503a-46b7-8c0b-4e69a7618a9b" pod="openshift-marketplace/redhat-marketplace-mm4q6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mm4q6\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:35 crc kubenswrapper[4957]: I0123 10:55:35.691821 4957 status_manager.go:851] "Failed to get status for pod" podUID="cd776179-759c-4fd9-987a-445a1d516d4c" pod="openshift-marketplace/certified-operators-bsgcz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-bsgcz\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:35 crc kubenswrapper[4957]: I0123 10:55:35.843624 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d86df6f9-d655-44a7-a92e-8c7bf90d92af-var-lock\") pod \"d86df6f9-d655-44a7-a92e-8c7bf90d92af\" (UID: \"d86df6f9-d655-44a7-a92e-8c7bf90d92af\") " Jan 23 10:55:35 crc kubenswrapper[4957]: I0123 10:55:35.843971 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d86df6f9-d655-44a7-a92e-8c7bf90d92af-kube-api-access\") pod \"d86df6f9-d655-44a7-a92e-8c7bf90d92af\" (UID: \"d86df6f9-d655-44a7-a92e-8c7bf90d92af\") " Jan 23 10:55:35 crc kubenswrapper[4957]: I0123 10:55:35.844002 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d86df6f9-d655-44a7-a92e-8c7bf90d92af-kubelet-dir\") pod \"d86df6f9-d655-44a7-a92e-8c7bf90d92af\" (UID: \"d86df6f9-d655-44a7-a92e-8c7bf90d92af\") " Jan 23 10:55:35 crc kubenswrapper[4957]: I0123 10:55:35.843744 
4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d86df6f9-d655-44a7-a92e-8c7bf90d92af-var-lock" (OuterVolumeSpecName: "var-lock") pod "d86df6f9-d655-44a7-a92e-8c7bf90d92af" (UID: "d86df6f9-d655-44a7-a92e-8c7bf90d92af"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 23 10:55:35 crc kubenswrapper[4957]: I0123 10:55:35.844347 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d86df6f9-d655-44a7-a92e-8c7bf90d92af-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "d86df6f9-d655-44a7-a92e-8c7bf90d92af" (UID: "d86df6f9-d655-44a7-a92e-8c7bf90d92af"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 23 10:55:35 crc kubenswrapper[4957]: I0123 10:55:35.849956 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d86df6f9-d655-44a7-a92e-8c7bf90d92af-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "d86df6f9-d655-44a7-a92e-8c7bf90d92af" (UID: "d86df6f9-d655-44a7-a92e-8c7bf90d92af"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 23 10:55:35 crc kubenswrapper[4957]: E0123 10:55:35.850009 4957 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.9:6443: connect: connection refused" interval="400ms" Jan 23 10:55:35 crc kubenswrapper[4957]: I0123 10:55:35.946930 4957 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d86df6f9-d655-44a7-a92e-8c7bf90d92af-var-lock\") on node \"crc\" DevicePath \"\"" Jan 23 10:55:35 crc kubenswrapper[4957]: I0123 10:55:35.947076 4957 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d86df6f9-d655-44a7-a92e-8c7bf90d92af-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 23 10:55:35 crc kubenswrapper[4957]: I0123 10:55:35.947085 4957 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d86df6f9-d655-44a7-a92e-8c7bf90d92af-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 23 10:55:36 crc kubenswrapper[4957]: I0123 10:55:36.076344 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"d86df6f9-d655-44a7-a92e-8c7bf90d92af","Type":"ContainerDied","Data":"817728bd954477d298fbc9cc4f5c63d0c743cff26a7200bd59a37f5cdb1eb6bc"} Jan 23 10:55:36 crc kubenswrapper[4957]: I0123 10:55:36.076405 4957 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 23 10:55:36 crc kubenswrapper[4957]: I0123 10:55:36.076420 4957 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="817728bd954477d298fbc9cc4f5c63d0c743cff26a7200bd59a37f5cdb1eb6bc" Jan 23 10:55:36 crc kubenswrapper[4957]: I0123 10:55:36.097229 4957 status_manager.go:851] "Failed to get status for pod" podUID="d86df6f9-d655-44a7-a92e-8c7bf90d92af" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:36 crc kubenswrapper[4957]: I0123 10:55:36.097801 4957 status_manager.go:851] "Failed to get status for pod" podUID="0aee337e-503a-46b7-8c0b-4e69a7618a9b" pod="openshift-marketplace/redhat-marketplace-mm4q6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mm4q6\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:36 crc kubenswrapper[4957]: I0123 10:55:36.098112 4957 status_manager.go:851] "Failed to get status for pod" podUID="cd776179-759c-4fd9-987a-445a1d516d4c" pod="openshift-marketplace/certified-operators-bsgcz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-bsgcz\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:36 crc kubenswrapper[4957]: I0123 10:55:36.098419 4957 status_manager.go:851] "Failed to get status for pod" podUID="d9dc559e-59e7-459c-9a4a-fb361bffad34" pod="openshift-marketplace/community-operators-2p4vh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-2p4vh\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:36 crc kubenswrapper[4957]: I0123 10:55:36.098697 4957 status_manager.go:851] "Failed to get status for pod" podUID="9072b86d-252b-4804-ab47-be737d2a88ee" pod="openshift-marketplace/redhat-operators-xhhxg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-xhhxg\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:36 crc kubenswrapper[4957]: I0123 10:55:36.099035 4957 status_manager.go:851] "Failed to get status for pod" podUID="4856b826-759f-4132-bfc5-d80385771e22" pod="openshift-image-registry/image-registry-66df7c8f76-p5lm9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-registry-66df7c8f76-p5lm9\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:36 crc kubenswrapper[4957]: E0123 10:55:36.250967 4957 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.9:6443: connect: connection refused" interval="800ms" Jan 23 10:55:36 crc kubenswrapper[4957]: I0123 10:55:36.930558 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-mm4q6" Jan 23 10:55:36 crc kubenswrapper[4957]: I0123 10:55:36.930813 4957 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-mm4q6" Jan 23 10:55:36 crc kubenswrapper[4957]: I0123 10:55:36.971949 4957 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-marketplace-mm4q6" Jan 23 10:55:36 crc kubenswrapper[4957]: I0123 10:55:36.972627 4957 status_manager.go:851] "Failed to get status for pod" podUID="cd776179-759c-4fd9-987a-445a1d516d4c" pod="openshift-marketplace/certified-operators-bsgcz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-bsgcz\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:36 crc kubenswrapper[4957]: I0123 10:55:36.973031 4957 status_manager.go:851] "Failed to get status for pod" podUID="d9dc559e-59e7-459c-9a4a-fb361bffad34" pod="openshift-marketplace/community-operators-2p4vh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-2p4vh\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:36 crc kubenswrapper[4957]: I0123 10:55:36.973523 4957 status_manager.go:851] "Failed to get status for pod" podUID="9072b86d-252b-4804-ab47-be737d2a88ee" pod="openshift-marketplace/redhat-operators-xhhxg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-xhhxg\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:36 crc kubenswrapper[4957]: I0123 10:55:36.973794 4957 status_manager.go:851] "Failed to get status for pod" podUID="4856b826-759f-4132-bfc5-d80385771e22" pod="openshift-image-registry/image-registry-66df7c8f76-p5lm9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-registry-66df7c8f76-p5lm9\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:36 crc kubenswrapper[4957]: I0123 10:55:36.974124 4957 status_manager.go:851] "Failed to get status for pod" podUID="d86df6f9-d655-44a7-a92e-8c7bf90d92af" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:36 crc kubenswrapper[4957]: I0123 10:55:36.974387 4957 status_manager.go:851] "Failed to get status for pod" podUID="0aee337e-503a-46b7-8c0b-4e69a7618a9b" pod="openshift-marketplace/redhat-marketplace-mm4q6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mm4q6\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:37 crc kubenswrapper[4957]: E0123 10:55:37.052428 4957 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.9:6443: connect: connection refused" interval="1.6s" Jan 23 10:55:37 crc kubenswrapper[4957]: I0123 10:55:37.146038 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-mm4q6" Jan 23 10:55:37 crc kubenswrapper[4957]: I0123 10:55:37.146600 4957 status_manager.go:851] "Failed to get status for pod" podUID="cd776179-759c-4fd9-987a-445a1d516d4c" pod="openshift-marketplace/certified-operators-bsgcz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-bsgcz\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:37 crc kubenswrapper[4957]: I0123 10:55:37.146951 4957 status_manager.go:851] "Failed to get status for pod" podUID="d9dc559e-59e7-459c-9a4a-fb361bffad34" 
pod="openshift-marketplace/community-operators-2p4vh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-2p4vh\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:37 crc kubenswrapper[4957]: I0123 10:55:37.147135 4957 status_manager.go:851] "Failed to get status for pod" podUID="9072b86d-252b-4804-ab47-be737d2a88ee" pod="openshift-marketplace/redhat-operators-xhhxg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-xhhxg\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:37 crc kubenswrapper[4957]: I0123 10:55:37.147345 4957 status_manager.go:851] "Failed to get status for pod" podUID="4856b826-759f-4132-bfc5-d80385771e22" pod="openshift-image-registry/image-registry-66df7c8f76-p5lm9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-registry-66df7c8f76-p5lm9\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:37 crc kubenswrapper[4957]: I0123 10:55:37.147537 4957 status_manager.go:851] "Failed to get status for pod" podUID="d86df6f9-d655-44a7-a92e-8c7bf90d92af" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:37 crc kubenswrapper[4957]: I0123 10:55:37.147715 4957 status_manager.go:851] "Failed to get status for pod" podUID="0aee337e-503a-46b7-8c0b-4e69a7618a9b" pod="openshift-marketplace/redhat-marketplace-mm4q6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mm4q6\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:37 crc kubenswrapper[4957]: I0123 10:55:37.764212 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-bsgcz" Jan 23 10:55:37 crc kubenswrapper[4957]: I0123 10:55:37.764510 4957 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-bsgcz" Jan 23 10:55:37 crc kubenswrapper[4957]: I0123 10:55:37.818405 4957 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-bsgcz" Jan 23 10:55:37 crc kubenswrapper[4957]: I0123 10:55:37.818912 4957 status_manager.go:851] "Failed to get status for pod" podUID="d9dc559e-59e7-459c-9a4a-fb361bffad34" pod="openshift-marketplace/community-operators-2p4vh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-2p4vh\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:37 crc kubenswrapper[4957]: I0123 10:55:37.819201 4957 status_manager.go:851] "Failed to get status for pod" podUID="9072b86d-252b-4804-ab47-be737d2a88ee" pod="openshift-marketplace/redhat-operators-xhhxg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-xhhxg\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:37 crc kubenswrapper[4957]: I0123 10:55:37.819571 4957 status_manager.go:851] "Failed to get status for pod" podUID="4856b826-759f-4132-bfc5-d80385771e22" pod="openshift-image-registry/image-registry-66df7c8f76-p5lm9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-registry-66df7c8f76-p5lm9\": dial tcp 38.102.83.9:6443: 
connect: connection refused" Jan 23 10:55:37 crc kubenswrapper[4957]: I0123 10:55:37.819788 4957 status_manager.go:851] "Failed to get status for pod" podUID="d86df6f9-d655-44a7-a92e-8c7bf90d92af" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:37 crc kubenswrapper[4957]: I0123 10:55:37.820005 4957 status_manager.go:851] "Failed to get status for pod" podUID="0aee337e-503a-46b7-8c0b-4e69a7618a9b" pod="openshift-marketplace/redhat-marketplace-mm4q6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mm4q6\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:37 crc kubenswrapper[4957]: I0123 10:55:37.820215 4957 status_manager.go:851] "Failed to get status for pod" podUID="cd776179-759c-4fd9-987a-445a1d516d4c" pod="openshift-marketplace/certified-operators-bsgcz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-bsgcz\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:37 crc kubenswrapper[4957]: E0123 10:55:37.997387 4957 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.9:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 23 10:55:37 crc kubenswrapper[4957]: I0123 10:55:37.997917 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 23 10:55:38 crc kubenswrapper[4957]: W0123 10:55:38.016549 4957 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-81ba60bdbdb4423edae6133112cdb82b58034bdf9fbdd82979166279c0895cb4 WatchSource:0}: Error finding container 81ba60bdbdb4423edae6133112cdb82b58034bdf9fbdd82979166279c0895cb4: Status 404 returned error can't find the container with id 81ba60bdbdb4423edae6133112cdb82b58034bdf9fbdd82979166279c0895cb4 Jan 23 10:55:38 crc kubenswrapper[4957]: I0123 10:55:38.096255 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"81ba60bdbdb4423edae6133112cdb82b58034bdf9fbdd82979166279c0895cb4"} Jan 23 10:55:38 crc kubenswrapper[4957]: I0123 10:55:38.143767 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-bsgcz" Jan 23 10:55:38 crc kubenswrapper[4957]: I0123 10:55:38.144638 4957 status_manager.go:851] "Failed to get status for pod" podUID="4856b826-759f-4132-bfc5-d80385771e22" pod="openshift-image-registry/image-registry-66df7c8f76-p5lm9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-registry-66df7c8f76-p5lm9\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:38 crc kubenswrapper[4957]: I0123 10:55:38.145073 4957 status_manager.go:851] "Failed to get status for pod" podUID="d86df6f9-d655-44a7-a92e-8c7bf90d92af" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 
38.102.83.9:6443: connect: connection refused" Jan 23 10:55:38 crc kubenswrapper[4957]: I0123 10:55:38.145449 4957 status_manager.go:851] "Failed to get status for pod" podUID="0aee337e-503a-46b7-8c0b-4e69a7618a9b" pod="openshift-marketplace/redhat-marketplace-mm4q6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mm4q6\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:38 crc kubenswrapper[4957]: I0123 10:55:38.145828 4957 status_manager.go:851] "Failed to get status for pod" podUID="cd776179-759c-4fd9-987a-445a1d516d4c" pod="openshift-marketplace/certified-operators-bsgcz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-bsgcz\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:38 crc kubenswrapper[4957]: I0123 10:55:38.146061 4957 status_manager.go:851] "Failed to get status for pod" podUID="d9dc559e-59e7-459c-9a4a-fb361bffad34" pod="openshift-marketplace/community-operators-2p4vh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-2p4vh\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:38 crc kubenswrapper[4957]: I0123 10:55:38.146945 4957 status_manager.go:851] "Failed to get status for pod" podUID="9072b86d-252b-4804-ab47-be737d2a88ee" pod="openshift-marketplace/redhat-operators-xhhxg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-xhhxg\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:38 crc kubenswrapper[4957]: E0123 10:55:38.652756 4957 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.9:6443: connect: connection refused" interval="3.2s" Jan 23 10:55:39 crc kubenswrapper[4957]: I0123 10:55:39.102712 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 23 10:55:39 crc kubenswrapper[4957]: I0123 10:55:39.337262 4957 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xhhxg" Jan 23 10:55:39 crc kubenswrapper[4957]: I0123 10:55:39.337753 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xhhxg" Jan 23 10:55:39 crc kubenswrapper[4957]: I0123 10:55:39.379063 4957 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xhhxg" Jan 23 10:55:39 crc kubenswrapper[4957]: I0123 10:55:39.379642 4957 status_manager.go:851] "Failed to get status for pod" podUID="cd776179-759c-4fd9-987a-445a1d516d4c" pod="openshift-marketplace/certified-operators-bsgcz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-bsgcz\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:39 crc kubenswrapper[4957]: I0123 10:55:39.379974 4957 status_manager.go:851] "Failed to get status for pod" podUID="d9dc559e-59e7-459c-9a4a-fb361bffad34" pod="openshift-marketplace/community-operators-2p4vh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-2p4vh\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 
10:55:39 crc kubenswrapper[4957]: I0123 10:55:39.380161 4957 status_manager.go:851] "Failed to get status for pod" podUID="9072b86d-252b-4804-ab47-be737d2a88ee" pod="openshift-marketplace/redhat-operators-xhhxg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-xhhxg\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:39 crc kubenswrapper[4957]: I0123 10:55:39.380376 4957 status_manager.go:851] "Failed to get status for pod" podUID="4856b826-759f-4132-bfc5-d80385771e22" pod="openshift-image-registry/image-registry-66df7c8f76-p5lm9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-registry-66df7c8f76-p5lm9\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:39 crc kubenswrapper[4957]: I0123 10:55:39.380566 4957 status_manager.go:851] "Failed to get status for pod" podUID="d86df6f9-d655-44a7-a92e-8c7bf90d92af" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:39 crc kubenswrapper[4957]: I0123 10:55:39.380749 4957 status_manager.go:851] "Failed to get status for pod" podUID="0aee337e-503a-46b7-8c0b-4e69a7618a9b" pod="openshift-marketplace/redhat-marketplace-mm4q6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mm4q6\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:39 crc kubenswrapper[4957]: I0123 10:55:39.941605 4957 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-2p4vh" Jan 23 10:55:39 crc kubenswrapper[4957]: I0123 10:55:39.941670 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-2p4vh" Jan 23 10:55:40 crc kubenswrapper[4957]: I0123 10:55:40.002844 4957 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-2p4vh" Jan 23 10:55:40 crc kubenswrapper[4957]: I0123 10:55:40.003605 4957 status_manager.go:851] "Failed to get status for pod" podUID="d86df6f9-d655-44a7-a92e-8c7bf90d92af" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:40 crc kubenswrapper[4957]: I0123 10:55:40.004063 4957 status_manager.go:851] "Failed to get status for pod" podUID="0aee337e-503a-46b7-8c0b-4e69a7618a9b" pod="openshift-marketplace/redhat-marketplace-mm4q6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mm4q6\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:40 crc kubenswrapper[4957]: I0123 10:55:40.004594 4957 status_manager.go:851] "Failed to get status for pod" podUID="cd776179-759c-4fd9-987a-445a1d516d4c" pod="openshift-marketplace/certified-operators-bsgcz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-bsgcz\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:40 crc kubenswrapper[4957]: I0123 10:55:40.004923 4957 status_manager.go:851] "Failed to get status for pod" podUID="d9dc559e-59e7-459c-9a4a-fb361bffad34" pod="openshift-marketplace/community-operators-2p4vh" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-2p4vh\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:40 crc kubenswrapper[4957]: I0123 10:55:40.005412 4957 status_manager.go:851] "Failed to get status for pod" podUID="9072b86d-252b-4804-ab47-be737d2a88ee" pod="openshift-marketplace/redhat-operators-xhhxg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-xhhxg\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:40 crc kubenswrapper[4957]: I0123 10:55:40.005840 4957 status_manager.go:851] "Failed to get status for pod" podUID="4856b826-759f-4132-bfc5-d80385771e22" pod="openshift-image-registry/image-registry-66df7c8f76-p5lm9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-registry-66df7c8f76-p5lm9\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:40 crc kubenswrapper[4957]: I0123 10:55:40.773037 4957 status_manager.go:851] "Failed to get status for pod" podUID="cd776179-759c-4fd9-987a-445a1d516d4c" pod="openshift-marketplace/certified-operators-bsgcz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-bsgcz\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:40 crc kubenswrapper[4957]: I0123 10:55:40.773654 4957 status_manager.go:851] "Failed to get status for pod" podUID="d9dc559e-59e7-459c-9a4a-fb361bffad34" pod="openshift-marketplace/community-operators-2p4vh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-2p4vh\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:40 crc kubenswrapper[4957]: I0123 10:55:40.773993 4957 status_manager.go:851] "Failed to get status for pod" podUID="9072b86d-252b-4804-ab47-be737d2a88ee" pod="openshift-marketplace/redhat-operators-xhhxg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-xhhxg\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:40 crc kubenswrapper[4957]: I0123 10:55:40.774255 4957 status_manager.go:851] "Failed to get status for pod" podUID="4856b826-759f-4132-bfc5-d80385771e22" pod="openshift-image-registry/image-registry-66df7c8f76-p5lm9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-registry-66df7c8f76-p5lm9\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:40 crc kubenswrapper[4957]: I0123 10:55:40.774542 4957 status_manager.go:851] "Failed to get status for pod" podUID="d86df6f9-d655-44a7-a92e-8c7bf90d92af" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:40 crc kubenswrapper[4957]: I0123 10:55:40.774877 4957 status_manager.go:851] "Failed to get status for pod" podUID="0aee337e-503a-46b7-8c0b-4e69a7618a9b" pod="openshift-marketplace/redhat-marketplace-mm4q6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mm4q6\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:41 crc kubenswrapper[4957]: E0123 10:55:41.854405 4957 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.9:6443: connect: connection refused" interval="6.4s" Jan 23 10:55:43 crc kubenswrapper[4957]: I0123 10:55:43.372766 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver/0.log" Jan 23 10:55:43 crc kubenswrapper[4957]: I0123 10:55:43.374213 4957 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="8cdf71b1a8491d3a4853fde19a5b1af1eb4697cbf07de482e22a52704ba0470f" exitCode=-1 Jan 23 10:55:44 crc kubenswrapper[4957]: E0123 10:55:44.283251 4957 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/events\": dial tcp 38.102.83.9:6443: connect: connection refused" event="&Event{ObjectMeta:{community-operators-2p4vh.188d56dc59b15340 openshift-marketplace 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-marketplace,Name:community-operators-2p4vh,UID:d9dc559e-59e7-459c-9a4a-fb361bffad34,APIVersion:v1,ResourceVersion:29941,FieldPath:spec.containers{registry-server},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\",Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-23 10:55:33.024060224 +0000 UTC m=+242.561312911,LastTimestamp:2026-01-23 10:55:33.024060224 +0000 UTC m=+242.561312911,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 23 10:55:44 crc kubenswrapper[4957]: I0123 10:55:44.379874 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"b2d7fd59cbbf3cb5b7c203a5d05b8645d2bb0424a0a049225fd52a21950e9c3b"} Jan 23 10:55:44 crc kubenswrapper[4957]: I0123 10:55:44.380263 4957 status_manager.go:851] "Failed to get status for pod" podUID="cd776179-759c-4fd9-987a-445a1d516d4c" pod="openshift-marketplace/certified-operators-bsgcz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-bsgcz\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:44 crc kubenswrapper[4957]: E0123 10:55:44.380344 4957 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.9:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 23 10:55:44 crc kubenswrapper[4957]: I0123 10:55:44.380476 4957 status_manager.go:851] "Failed to get status for pod" podUID="d9dc559e-59e7-459c-9a4a-fb361bffad34" pod="openshift-marketplace/community-operators-2p4vh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-2p4vh\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:44 crc kubenswrapper[4957]: I0123 10:55:44.380622 4957 status_manager.go:851] "Failed to get status for pod" podUID="9072b86d-252b-4804-ab47-be737d2a88ee" pod="openshift-marketplace/redhat-operators-xhhxg" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-xhhxg\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:44 crc kubenswrapper[4957]: I0123 10:55:44.380774 4957 status_manager.go:851] "Failed to get status for pod" podUID="4856b826-759f-4132-bfc5-d80385771e22" pod="openshift-image-registry/image-registry-66df7c8f76-p5lm9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-registry-66df7c8f76-p5lm9\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:44 crc kubenswrapper[4957]: I0123 10:55:44.380982 4957 status_manager.go:851] "Failed to get status for pod" podUID="d86df6f9-d655-44a7-a92e-8c7bf90d92af" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:44 crc kubenswrapper[4957]: I0123 10:55:44.381201 4957 status_manager.go:851] "Failed to get status for pod" podUID="0aee337e-503a-46b7-8c0b-4e69a7618a9b" pod="openshift-marketplace/redhat-marketplace-mm4q6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mm4q6\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:44 crc kubenswrapper[4957]: I0123 10:55:44.383366 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 23 10:55:44 crc kubenswrapper[4957]: I0123 10:55:44.385005 4957 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="309f8eeef51d67edae6da19713d705bab2cfe91a537194274be5258beebc8e17" Jan 23 10:55:44 crc kubenswrapper[4957]: I0123 10:55:44.403922 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 23 10:55:44 crc kubenswrapper[4957]: I0123 10:55:44.404613 4957 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 23 10:55:44 crc kubenswrapper[4957]: I0123 10:55:44.405218 4957 status_manager.go:851] "Failed to get status for pod" podUID="cd776179-759c-4fd9-987a-445a1d516d4c" pod="openshift-marketplace/certified-operators-bsgcz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-bsgcz\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:44 crc kubenswrapper[4957]: I0123 10:55:44.405440 4957 status_manager.go:851] "Failed to get status for pod" podUID="d9dc559e-59e7-459c-9a4a-fb361bffad34" pod="openshift-marketplace/community-operators-2p4vh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-2p4vh\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:44 crc kubenswrapper[4957]: I0123 10:55:44.405635 4957 status_manager.go:851] "Failed to get status for pod" podUID="9072b86d-252b-4804-ab47-be737d2a88ee" pod="openshift-marketplace/redhat-operators-xhhxg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-xhhxg\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:44 crc kubenswrapper[4957]: I0123 10:55:44.405840 4957 status_manager.go:851] "Failed to get status for pod" podUID="4856b826-759f-4132-bfc5-d80385771e22" pod="openshift-image-registry/image-registry-66df7c8f76-p5lm9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-registry-66df7c8f76-p5lm9\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:44 crc kubenswrapper[4957]: I0123 10:55:44.406029 4957 status_manager.go:851] "Failed to get status for pod" podUID="d86df6f9-d655-44a7-a92e-8c7bf90d92af" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:44 crc kubenswrapper[4957]: I0123 10:55:44.406220 4957 status_manager.go:851] "Failed to get status for pod" podUID="0aee337e-503a-46b7-8c0b-4e69a7618a9b" pod="openshift-marketplace/redhat-marketplace-mm4q6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mm4q6\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:44 crc kubenswrapper[4957]: I0123 10:55:44.406446 4957 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:44 crc kubenswrapper[4957]: I0123 10:55:44.424843 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-2p4vh" Jan 23 10:55:44 crc kubenswrapper[4957]: I0123 10:55:44.425587 4957 status_manager.go:851] "Failed to get status for pod" podUID="d9dc559e-59e7-459c-9a4a-fb361bffad34" pod="openshift-marketplace/community-operators-2p4vh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-2p4vh\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:44 crc kubenswrapper[4957]: I0123 10:55:44.427598 4957 status_manager.go:851] "Failed to get status for pod" 
podUID="9072b86d-252b-4804-ab47-be737d2a88ee" pod="openshift-marketplace/redhat-operators-xhhxg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-xhhxg\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:44 crc kubenswrapper[4957]: I0123 10:55:44.428149 4957 status_manager.go:851] "Failed to get status for pod" podUID="4856b826-759f-4132-bfc5-d80385771e22" pod="openshift-image-registry/image-registry-66df7c8f76-p5lm9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-registry-66df7c8f76-p5lm9\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:44 crc kubenswrapper[4957]: I0123 10:55:44.428539 4957 status_manager.go:851] "Failed to get status for pod" podUID="d86df6f9-d655-44a7-a92e-8c7bf90d92af" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:44 crc kubenswrapper[4957]: I0123 10:55:44.428572 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xhhxg" Jan 23 10:55:44 crc kubenswrapper[4957]: I0123 10:55:44.428838 4957 status_manager.go:851] "Failed to get status for pod" podUID="0aee337e-503a-46b7-8c0b-4e69a7618a9b" pod="openshift-marketplace/redhat-marketplace-mm4q6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mm4q6\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:44 crc kubenswrapper[4957]: I0123 10:55:44.429063 4957 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:44 crc kubenswrapper[4957]: I0123 10:55:44.429365 4957 status_manager.go:851] "Failed to get status for pod" podUID="cd776179-759c-4fd9-987a-445a1d516d4c" pod="openshift-marketplace/certified-operators-bsgcz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-bsgcz\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:44 crc kubenswrapper[4957]: I0123 10:55:44.429690 4957 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:44 crc kubenswrapper[4957]: I0123 10:55:44.429926 4957 status_manager.go:851] "Failed to get status for pod" podUID="0aee337e-503a-46b7-8c0b-4e69a7618a9b" pod="openshift-marketplace/redhat-marketplace-mm4q6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mm4q6\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:44 crc kubenswrapper[4957]: I0123 10:55:44.430144 4957 status_manager.go:851] "Failed to get status for pod" podUID="cd776179-759c-4fd9-987a-445a1d516d4c" pod="openshift-marketplace/certified-operators-bsgcz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-bsgcz\": dial tcp 
38.102.83.9:6443: connect: connection refused" Jan 23 10:55:44 crc kubenswrapper[4957]: I0123 10:55:44.430431 4957 status_manager.go:851] "Failed to get status for pod" podUID="d9dc559e-59e7-459c-9a4a-fb361bffad34" pod="openshift-marketplace/community-operators-2p4vh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-2p4vh\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:44 crc kubenswrapper[4957]: I0123 10:55:44.430749 4957 status_manager.go:851] "Failed to get status for pod" podUID="9072b86d-252b-4804-ab47-be737d2a88ee" pod="openshift-marketplace/redhat-operators-xhhxg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-xhhxg\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:44 crc kubenswrapper[4957]: I0123 10:55:44.431000 4957 status_manager.go:851] "Failed to get status for pod" podUID="4856b826-759f-4132-bfc5-d80385771e22" pod="openshift-image-registry/image-registry-66df7c8f76-p5lm9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-registry-66df7c8f76-p5lm9\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:44 crc kubenswrapper[4957]: I0123 10:55:44.431257 4957 status_manager.go:851] "Failed to get status for pod" podUID="d86df6f9-d655-44a7-a92e-8c7bf90d92af" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:44 crc kubenswrapper[4957]: I0123 10:55:44.460843 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Jan 23 10:55:44 crc kubenswrapper[4957]: I0123 10:55:44.460907 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Jan 23 10:55:44 crc kubenswrapper[4957]: I0123 10:55:44.460965 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Jan 23 10:55:44 crc kubenswrapper[4957]: I0123 10:55:44.461002 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 23 10:55:44 crc kubenswrapper[4957]: I0123 10:55:44.461059 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 23 10:55:44 crc kubenswrapper[4957]: I0123 10:55:44.461155 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 23 10:55:44 crc kubenswrapper[4957]: I0123 10:55:44.461407 4957 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Jan 23 10:55:44 crc kubenswrapper[4957]: I0123 10:55:44.461428 4957 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Jan 23 10:55:44 crc kubenswrapper[4957]: I0123 10:55:44.461439 4957 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Jan 23 10:55:44 crc kubenswrapper[4957]: I0123 10:55:44.776621 4957 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Jan 23 10:55:45 crc kubenswrapper[4957]: I0123 10:55:45.390822 4957 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 23 10:55:45 crc kubenswrapper[4957]: E0123 10:55:45.391603 4957 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.9:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 23 10:55:45 crc kubenswrapper[4957]: I0123 10:55:45.391946 4957 status_manager.go:851] "Failed to get status for pod" podUID="9072b86d-252b-4804-ab47-be737d2a88ee" pod="openshift-marketplace/redhat-operators-xhhxg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-xhhxg\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:45 crc kubenswrapper[4957]: I0123 10:55:45.392431 4957 status_manager.go:851] "Failed to get status for pod" podUID="4856b826-759f-4132-bfc5-d80385771e22" pod="openshift-image-registry/image-registry-66df7c8f76-p5lm9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-registry-66df7c8f76-p5lm9\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:45 crc kubenswrapper[4957]: I0123 10:55:45.392912 4957 status_manager.go:851] "Failed to get status for pod" podUID="d86df6f9-d655-44a7-a92e-8c7bf90d92af" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:45 crc kubenswrapper[4957]: I0123 10:55:45.393481 4957 status_manager.go:851] "Failed to get status for pod" podUID="0aee337e-503a-46b7-8c0b-4e69a7618a9b" pod="openshift-marketplace/redhat-marketplace-mm4q6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mm4q6\": dial tcp 
38.102.83.9:6443: connect: connection refused" Jan 23 10:55:45 crc kubenswrapper[4957]: I0123 10:55:45.393972 4957 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:45 crc kubenswrapper[4957]: I0123 10:55:45.394441 4957 status_manager.go:851] "Failed to get status for pod" podUID="cd776179-759c-4fd9-987a-445a1d516d4c" pod="openshift-marketplace/certified-operators-bsgcz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-bsgcz\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:45 crc kubenswrapper[4957]: I0123 10:55:45.395081 4957 status_manager.go:851] "Failed to get status for pod" podUID="d9dc559e-59e7-459c-9a4a-fb361bffad34" pod="openshift-marketplace/community-operators-2p4vh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-2p4vh\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:45 crc kubenswrapper[4957]: I0123 10:55:45.396329 4957 status_manager.go:851] "Failed to get status for pod" podUID="d9dc559e-59e7-459c-9a4a-fb361bffad34" pod="openshift-marketplace/community-operators-2p4vh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-2p4vh\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:45 crc kubenswrapper[4957]: I0123 10:55:45.396836 4957 status_manager.go:851] "Failed to get status for pod" podUID="9072b86d-252b-4804-ab47-be737d2a88ee" pod="openshift-marketplace/redhat-operators-xhhxg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-xhhxg\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:45 crc kubenswrapper[4957]: I0123 10:55:45.398220 4957 status_manager.go:851] "Failed to get status for pod" podUID="4856b826-759f-4132-bfc5-d80385771e22" pod="openshift-image-registry/image-registry-66df7c8f76-p5lm9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-registry-66df7c8f76-p5lm9\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:45 crc kubenswrapper[4957]: I0123 10:55:45.398969 4957 status_manager.go:851] "Failed to get status for pod" podUID="d86df6f9-d655-44a7-a92e-8c7bf90d92af" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:45 crc kubenswrapper[4957]: I0123 10:55:45.399483 4957 status_manager.go:851] "Failed to get status for pod" podUID="0aee337e-503a-46b7-8c0b-4e69a7618a9b" pod="openshift-marketplace/redhat-marketplace-mm4q6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mm4q6\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:45 crc kubenswrapper[4957]: I0123 10:55:45.399880 4957 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.9:6443: 
connect: connection refused" Jan 23 10:55:45 crc kubenswrapper[4957]: I0123 10:55:45.400608 4957 status_manager.go:851] "Failed to get status for pod" podUID="cd776179-759c-4fd9-987a-445a1d516d4c" pod="openshift-marketplace/certified-operators-bsgcz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-bsgcz\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:47 crc kubenswrapper[4957]: I0123 10:55:47.787551 4957 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Readiness probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Jan 23 10:55:47 crc kubenswrapper[4957]: I0123 10:55:47.787604 4957 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Jan 23 10:55:48 crc kubenswrapper[4957]: E0123 10:55:48.255416 4957 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.9:6443: connect: connection refused" interval="7s" Jan 23 10:55:48 crc kubenswrapper[4957]: I0123 10:55:48.592849 4957 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-wtmvm" podUID="dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d" containerName="oauth-openshift" containerID="cri-o://c2b3953fa767fab028d1ccb214f96f563a30df59f1b0977cb8145ea38261e6d7" gracePeriod=15 Jan 23 10:55:49 crc kubenswrapper[4957]: I0123 10:55:49.851588 4957 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Liveness probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Jan 23 10:55:49 crc kubenswrapper[4957]: I0123 10:55:49.852050 4957 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Jan 23 10:55:50 crc kubenswrapper[4957]: I0123 10:55:50.771452 4957 status_manager.go:851] "Failed to get status for pod" podUID="d86df6f9-d655-44a7-a92e-8c7bf90d92af" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:50 crc kubenswrapper[4957]: I0123 10:55:50.771989 4957 status_manager.go:851] "Failed to get status for pod" podUID="0aee337e-503a-46b7-8c0b-4e69a7618a9b" pod="openshift-marketplace/redhat-marketplace-mm4q6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mm4q6\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:50 crc kubenswrapper[4957]: I0123 10:55:50.772519 4957 status_manager.go:851] 
"Failed to get status for pod" podUID="cd776179-759c-4fd9-987a-445a1d516d4c" pod="openshift-marketplace/certified-operators-bsgcz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-bsgcz\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:50 crc kubenswrapper[4957]: I0123 10:55:50.772953 4957 status_manager.go:851] "Failed to get status for pod" podUID="d9dc559e-59e7-459c-9a4a-fb361bffad34" pod="openshift-marketplace/community-operators-2p4vh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-2p4vh\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:50 crc kubenswrapper[4957]: I0123 10:55:50.774051 4957 status_manager.go:851] "Failed to get status for pod" podUID="9072b86d-252b-4804-ab47-be737d2a88ee" pod="openshift-marketplace/redhat-operators-xhhxg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-xhhxg\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:50 crc kubenswrapper[4957]: I0123 10:55:50.774460 4957 status_manager.go:851] "Failed to get status for pod" podUID="4856b826-759f-4132-bfc5-d80385771e22" pod="openshift-image-registry/image-registry-66df7c8f76-p5lm9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-registry-66df7c8f76-p5lm9\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:51 crc kubenswrapper[4957]: I0123 10:55:51.426934 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Jan 23 10:55:51 crc kubenswrapper[4957]: I0123 10:55:51.427240 4957 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="c4d9f270c80ebedc7d79510e2f421e23789483dce954f5e1469469703660febf" exitCode=1 Jan 23 10:55:51 crc kubenswrapper[4957]: I0123 10:55:51.427311 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"c4d9f270c80ebedc7d79510e2f421e23789483dce954f5e1469469703660febf"} Jan 23 10:55:51 crc kubenswrapper[4957]: I0123 10:55:51.427979 4957 scope.go:117] "RemoveContainer" containerID="c4d9f270c80ebedc7d79510e2f421e23789483dce954f5e1469469703660febf" Jan 23 10:55:51 crc kubenswrapper[4957]: I0123 10:55:51.428478 4957 status_manager.go:851] "Failed to get status for pod" podUID="cd776179-759c-4fd9-987a-445a1d516d4c" pod="openshift-marketplace/certified-operators-bsgcz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-bsgcz\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:51 crc kubenswrapper[4957]: I0123 10:55:51.429006 4957 status_manager.go:851] "Failed to get status for pod" podUID="d9dc559e-59e7-459c-9a4a-fb361bffad34" pod="openshift-marketplace/community-operators-2p4vh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-2p4vh\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:51 crc kubenswrapper[4957]: I0123 10:55:51.429924 4957 status_manager.go:851] "Failed to get status for pod" podUID="9072b86d-252b-4804-ab47-be737d2a88ee" pod="openshift-marketplace/redhat-operators-xhhxg" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-xhhxg\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:51 crc kubenswrapper[4957]: I0123 10:55:51.430437 4957 status_manager.go:851] "Failed to get status for pod" podUID="4856b826-759f-4132-bfc5-d80385771e22" pod="openshift-image-registry/image-registry-66df7c8f76-p5lm9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-registry-66df7c8f76-p5lm9\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:51 crc kubenswrapper[4957]: I0123 10:55:51.430913 4957 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:51 crc kubenswrapper[4957]: I0123 10:55:51.431231 4957 status_manager.go:851] "Failed to get status for pod" podUID="d86df6f9-d655-44a7-a92e-8c7bf90d92af" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:51 crc kubenswrapper[4957]: I0123 10:55:51.431436 4957 generic.go:334] "Generic (PLEG): container finished" podID="dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d" containerID="c2b3953fa767fab028d1ccb214f96f563a30df59f1b0977cb8145ea38261e6d7" exitCode=0 Jan 23 10:55:51 crc kubenswrapper[4957]: I0123 10:55:51.431457 4957 status_manager.go:851] "Failed to get status for pod" podUID="0aee337e-503a-46b7-8c0b-4e69a7618a9b" pod="openshift-marketplace/redhat-marketplace-mm4q6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mm4q6\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:51 crc kubenswrapper[4957]: I0123 10:55:51.431503 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-wtmvm" event={"ID":"dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d","Type":"ContainerDied","Data":"c2b3953fa767fab028d1ccb214f96f563a30df59f1b0977cb8145ea38261e6d7"} Jan 23 10:55:52 crc kubenswrapper[4957]: I0123 10:55:51.999898 4957 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-wtmvm" Jan 23 10:55:52 crc kubenswrapper[4957]: I0123 10:55:52.000777 4957 status_manager.go:851] "Failed to get status for pod" podUID="4856b826-759f-4132-bfc5-d80385771e22" pod="openshift-image-registry/image-registry-66df7c8f76-p5lm9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-registry-66df7c8f76-p5lm9\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:52 crc kubenswrapper[4957]: I0123 10:55:52.001009 4957 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:52 crc kubenswrapper[4957]: I0123 10:55:52.001633 4957 status_manager.go:851] "Failed to get status for pod" podUID="d86df6f9-d655-44a7-a92e-8c7bf90d92af" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:52 crc kubenswrapper[4957]: I0123 10:55:52.001999 4957 status_manager.go:851] "Failed to get status for pod" podUID="dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d" pod="openshift-authentication/oauth-openshift-558db77b4-wtmvm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-wtmvm\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:52 crc kubenswrapper[4957]: I0123 10:55:52.002607 4957 status_manager.go:851] "Failed to get status for pod" podUID="0aee337e-503a-46b7-8c0b-4e69a7618a9b" pod="openshift-marketplace/redhat-marketplace-mm4q6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mm4q6\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:52 crc kubenswrapper[4957]: I0123 10:55:52.002862 4957 status_manager.go:851] "Failed to get status for pod" podUID="cd776179-759c-4fd9-987a-445a1d516d4c" pod="openshift-marketplace/certified-operators-bsgcz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-bsgcz\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:52 crc kubenswrapper[4957]: I0123 10:55:52.003154 4957 status_manager.go:851] "Failed to get status for pod" podUID="d9dc559e-59e7-459c-9a4a-fb361bffad34" pod="openshift-marketplace/community-operators-2p4vh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-2p4vh\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:52 crc kubenswrapper[4957]: I0123 10:55:52.003530 4957 status_manager.go:851] "Failed to get status for pod" podUID="9072b86d-252b-4804-ab47-be737d2a88ee" pod="openshift-marketplace/redhat-operators-xhhxg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-xhhxg\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:52 crc kubenswrapper[4957]: I0123 10:55:52.067657 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d-v4-0-config-system-cliconfig\") pod \"dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d\" (UID: \"dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d\") " Jan 23 10:55:52 crc kubenswrapper[4957]: I0123 10:55:52.067705 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d-v4-0-config-user-template-login\") pod \"dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d\" (UID: \"dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d\") " Jan 23 10:55:52 crc kubenswrapper[4957]: I0123 10:55:52.067724 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d-v4-0-config-system-session\") pod \"dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d\" (UID: \"dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d\") " Jan 23 10:55:52 crc kubenswrapper[4957]: I0123 10:55:52.067773 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d-v4-0-config-user-template-provider-selection\") pod \"dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d\" (UID: \"dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d\") " Jan 23 10:55:52 crc kubenswrapper[4957]: I0123 10:55:52.067804 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d-v4-0-config-system-trusted-ca-bundle\") pod \"dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d\" (UID: \"dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d\") " Jan 23 10:55:52 crc kubenswrapper[4957]: I0123 10:55:52.067824 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d-audit-policies\") pod \"dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d\" (UID: \"dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d\") " Jan 23 10:55:52 crc kubenswrapper[4957]: I0123 10:55:52.067843 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d-v4-0-config-system-ocp-branding-template\") pod \"dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d\" (UID: \"dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d\") " Jan 23 10:55:52 crc kubenswrapper[4957]: I0123 10:55:52.067863 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d-audit-dir\") pod \"dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d\" (UID: \"dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d\") " Jan 23 10:55:52 crc kubenswrapper[4957]: I0123 10:55:52.067884 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d-v4-0-config-user-idp-0-file-data\") pod \"dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d\" (UID: \"dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d\") " Jan 23 10:55:52 crc kubenswrapper[4957]: I0123 10:55:52.067907 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d-v4-0-config-system-router-certs\") pod 
\"dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d\" (UID: \"dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d\") " Jan 23 10:55:52 crc kubenswrapper[4957]: I0123 10:55:52.067931 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d-v4-0-config-system-service-ca\") pod \"dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d\" (UID: \"dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d\") " Jan 23 10:55:52 crc kubenswrapper[4957]: I0123 10:55:52.067951 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d-v4-0-config-user-template-error\") pod \"dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d\" (UID: \"dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d\") " Jan 23 10:55:52 crc kubenswrapper[4957]: I0123 10:55:52.067994 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d-v4-0-config-system-serving-cert\") pod \"dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d\" (UID: \"dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d\") " Jan 23 10:55:52 crc kubenswrapper[4957]: I0123 10:55:52.068027 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-klbjf\" (UniqueName: \"kubernetes.io/projected/dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d-kube-api-access-klbjf\") pod \"dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d\" (UID: \"dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d\") " Jan 23 10:55:52 crc kubenswrapper[4957]: I0123 10:55:52.069405 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d" (UID: "dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 10:55:52 crc kubenswrapper[4957]: I0123 10:55:52.069506 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d" (UID: "dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 10:55:52 crc kubenswrapper[4957]: I0123 10:55:52.069587 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d" (UID: "dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 23 10:55:52 crc kubenswrapper[4957]: I0123 10:55:52.069905 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d" (UID: "dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 10:55:52 crc kubenswrapper[4957]: I0123 10:55:52.070421 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d" (UID: "dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 10:55:52 crc kubenswrapper[4957]: I0123 10:55:52.078777 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d" (UID: "dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 23 10:55:52 crc kubenswrapper[4957]: I0123 10:55:52.079235 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d" (UID: "dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 23 10:55:52 crc kubenswrapper[4957]: I0123 10:55:52.079580 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d" (UID: "dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 23 10:55:52 crc kubenswrapper[4957]: I0123 10:55:52.079952 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d-kube-api-access-klbjf" (OuterVolumeSpecName: "kube-api-access-klbjf") pod "dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d" (UID: "dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d"). InnerVolumeSpecName "kube-api-access-klbjf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 23 10:55:52 crc kubenswrapper[4957]: I0123 10:55:52.080269 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d" (UID: "dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 23 10:55:52 crc kubenswrapper[4957]: I0123 10:55:52.080765 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d" (UID: "dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 23 10:55:52 crc kubenswrapper[4957]: I0123 10:55:52.081085 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d" (UID: "dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 23 10:55:52 crc kubenswrapper[4957]: I0123 10:55:52.081222 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d" (UID: "dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 23 10:55:52 crc kubenswrapper[4957]: I0123 10:55:52.081595 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d" (UID: "dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 23 10:55:52 crc kubenswrapper[4957]: I0123 10:55:52.169555 4957 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Jan 23 10:55:52 crc kubenswrapper[4957]: I0123 10:55:52.169587 4957 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Jan 23 10:55:52 crc kubenswrapper[4957]: I0123 10:55:52.169597 4957 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 23 10:55:52 crc kubenswrapper[4957]: I0123 10:55:52.169609 4957 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-klbjf\" (UniqueName: \"kubernetes.io/projected/dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d-kube-api-access-klbjf\") on node \"crc\" DevicePath \"\"" Jan 23 10:55:52 crc kubenswrapper[4957]: I0123 10:55:52.169619 4957 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Jan 23 10:55:52 crc kubenswrapper[4957]: I0123 10:55:52.169629 4957 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Jan 23 10:55:52 crc kubenswrapper[4957]: I0123 10:55:52.169647 4957 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Jan 23 10:55:52 crc kubenswrapper[4957]: I0123 10:55:52.169659 4957 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Jan 23 10:55:52 crc kubenswrapper[4957]: I0123 10:55:52.169670 4957 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 23 10:55:52 crc kubenswrapper[4957]: I0123 10:55:52.169680 4957 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d-audit-policies\") on node \"crc\" DevicePath \"\"" Jan 23 10:55:52 crc kubenswrapper[4957]: I0123 10:55:52.169689 4957 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Jan 23 10:55:52 crc kubenswrapper[4957]: I0123 10:55:52.169698 4957 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d-audit-dir\") on node \"crc\" DevicePath \"\"" Jan 23 10:55:52 crc kubenswrapper[4957]: I0123 10:55:52.169706 4957 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Jan 23 10:55:52 crc kubenswrapper[4957]: I0123 10:55:52.169718 4957 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Jan 23 10:55:52 crc kubenswrapper[4957]: I0123 10:55:52.440208 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Jan 23 10:55:52 crc kubenswrapper[4957]: I0123 10:55:52.440309 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"4c2f4d959c1c11e79bf4402475297dcc6d12dc9e60f0bf1694771536d2111665"} Jan 23 10:55:52 crc kubenswrapper[4957]: I0123 10:55:52.440939 4957 status_manager.go:851] "Failed to get status for pod" podUID="0aee337e-503a-46b7-8c0b-4e69a7618a9b" pod="openshift-marketplace/redhat-marketplace-mm4q6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mm4q6\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:52 crc kubenswrapper[4957]: I0123 10:55:52.441397 4957 status_manager.go:851] "Failed to get status for pod" podUID="dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d" pod="openshift-authentication/oauth-openshift-558db77b4-wtmvm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-wtmvm\": dial tcp 
38.102.83.9:6443: connect: connection refused" Jan 23 10:55:52 crc kubenswrapper[4957]: I0123 10:55:52.441708 4957 status_manager.go:851] "Failed to get status for pod" podUID="cd776179-759c-4fd9-987a-445a1d516d4c" pod="openshift-marketplace/certified-operators-bsgcz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-bsgcz\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:52 crc kubenswrapper[4957]: I0123 10:55:52.441992 4957 status_manager.go:851] "Failed to get status for pod" podUID="d9dc559e-59e7-459c-9a4a-fb361bffad34" pod="openshift-marketplace/community-operators-2p4vh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-2p4vh\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:52 crc kubenswrapper[4957]: I0123 10:55:52.442239 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-wtmvm" event={"ID":"dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d","Type":"ContainerDied","Data":"0aa13813f27e6f34a353f611b50a2b5b75dbd5c175051469e0ec01cea287a791"} Jan 23 10:55:52 crc kubenswrapper[4957]: I0123 10:55:52.442308 4957 scope.go:117] "RemoveContainer" containerID="c2b3953fa767fab028d1ccb214f96f563a30df59f1b0977cb8145ea38261e6d7" Jan 23 10:55:52 crc kubenswrapper[4957]: I0123 10:55:52.442340 4957 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-wtmvm" Jan 23 10:55:52 crc kubenswrapper[4957]: I0123 10:55:52.442363 4957 status_manager.go:851] "Failed to get status for pod" podUID="9072b86d-252b-4804-ab47-be737d2a88ee" pod="openshift-marketplace/redhat-operators-xhhxg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-xhhxg\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:52 crc kubenswrapper[4957]: I0123 10:55:52.442593 4957 status_manager.go:851] "Failed to get status for pod" podUID="4856b826-759f-4132-bfc5-d80385771e22" pod="openshift-image-registry/image-registry-66df7c8f76-p5lm9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-registry-66df7c8f76-p5lm9\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:52 crc kubenswrapper[4957]: I0123 10:55:52.442801 4957 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:52 crc kubenswrapper[4957]: I0123 10:55:52.442988 4957 status_manager.go:851] "Failed to get status for pod" podUID="d86df6f9-d655-44a7-a92e-8c7bf90d92af" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:52 crc kubenswrapper[4957]: I0123 10:55:52.443273 4957 status_manager.go:851] "Failed to get status for pod" podUID="d9dc559e-59e7-459c-9a4a-fb361bffad34" pod="openshift-marketplace/community-operators-2p4vh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-2p4vh\": dial tcp 38.102.83.9:6443: connect: connection 
refused" Jan 23 10:55:52 crc kubenswrapper[4957]: I0123 10:55:52.443506 4957 status_manager.go:851] "Failed to get status for pod" podUID="9072b86d-252b-4804-ab47-be737d2a88ee" pod="openshift-marketplace/redhat-operators-xhhxg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-xhhxg\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:52 crc kubenswrapper[4957]: I0123 10:55:52.443733 4957 status_manager.go:851] "Failed to get status for pod" podUID="4856b826-759f-4132-bfc5-d80385771e22" pod="openshift-image-registry/image-registry-66df7c8f76-p5lm9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-registry-66df7c8f76-p5lm9\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:52 crc kubenswrapper[4957]: I0123 10:55:52.443932 4957 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:52 crc kubenswrapper[4957]: I0123 10:55:52.444116 4957 status_manager.go:851] "Failed to get status for pod" podUID="d86df6f9-d655-44a7-a92e-8c7bf90d92af" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:52 crc kubenswrapper[4957]: I0123 10:55:52.444304 4957 status_manager.go:851] "Failed to get status for pod" podUID="dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d" pod="openshift-authentication/oauth-openshift-558db77b4-wtmvm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-wtmvm\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:52 crc kubenswrapper[4957]: I0123 10:55:52.444744 4957 status_manager.go:851] "Failed to get status for pod" podUID="0aee337e-503a-46b7-8c0b-4e69a7618a9b" pod="openshift-marketplace/redhat-marketplace-mm4q6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mm4q6\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:52 crc kubenswrapper[4957]: I0123 10:55:52.445024 4957 status_manager.go:851] "Failed to get status for pod" podUID="cd776179-759c-4fd9-987a-445a1d516d4c" pod="openshift-marketplace/certified-operators-bsgcz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-bsgcz\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:52 crc kubenswrapper[4957]: I0123 10:55:52.455449 4957 status_manager.go:851] "Failed to get status for pod" podUID="cd776179-759c-4fd9-987a-445a1d516d4c" pod="openshift-marketplace/certified-operators-bsgcz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-bsgcz\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:52 crc kubenswrapper[4957]: I0123 10:55:52.455654 4957 status_manager.go:851] "Failed to get status for pod" podUID="d9dc559e-59e7-459c-9a4a-fb361bffad34" pod="openshift-marketplace/community-operators-2p4vh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-2p4vh\": 
dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:52 crc kubenswrapper[4957]: I0123 10:55:52.455824 4957 status_manager.go:851] "Failed to get status for pod" podUID="9072b86d-252b-4804-ab47-be737d2a88ee" pod="openshift-marketplace/redhat-operators-xhhxg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-xhhxg\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:52 crc kubenswrapper[4957]: I0123 10:55:52.455986 4957 status_manager.go:851] "Failed to get status for pod" podUID="4856b826-759f-4132-bfc5-d80385771e22" pod="openshift-image-registry/image-registry-66df7c8f76-p5lm9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-registry-66df7c8f76-p5lm9\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:52 crc kubenswrapper[4957]: I0123 10:55:52.456126 4957 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:52 crc kubenswrapper[4957]: I0123 10:55:52.456255 4957 status_manager.go:851] "Failed to get status for pod" podUID="d86df6f9-d655-44a7-a92e-8c7bf90d92af" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:52 crc kubenswrapper[4957]: I0123 10:55:52.456425 4957 status_manager.go:851] "Failed to get status for pod" podUID="dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d" pod="openshift-authentication/oauth-openshift-558db77b4-wtmvm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-wtmvm\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:52 crc kubenswrapper[4957]: I0123 10:55:52.456569 4957 status_manager.go:851] "Failed to get status for pod" podUID="0aee337e-503a-46b7-8c0b-4e69a7618a9b" pod="openshift-marketplace/redhat-marketplace-mm4q6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mm4q6\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:52 crc kubenswrapper[4957]: E0123 10:55:52.775877 4957 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.9:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-hgc64" volumeName="registry-storage" Jan 23 10:55:54 crc kubenswrapper[4957]: E0123 10:55:54.284234 4957 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/events\": dial tcp 38.102.83.9:6443: connect: connection refused" event="&Event{ObjectMeta:{community-operators-2p4vh.188d56dc59b15340 openshift-marketplace 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-marketplace,Name:community-operators-2p4vh,UID:d9dc559e-59e7-459c-9a4a-fb361bffad34,APIVersion:v1,ResourceVersion:29941,FieldPath:spec.containers{registry-server},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\",Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-23 10:55:33.024060224 +0000 UTC m=+242.561312911,LastTimestamp:2026-01-23 10:55:33.024060224 +0000 UTC m=+242.561312911,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 23 10:55:54 crc kubenswrapper[4957]: I0123 10:55:54.978440 4957 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 23 10:55:54 crc kubenswrapper[4957]: I0123 10:55:54.984102 4957 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 23 10:55:54 crc kubenswrapper[4957]: I0123 10:55:54.984648 4957 status_manager.go:851] "Failed to get status for pod" podUID="d9dc559e-59e7-459c-9a4a-fb361bffad34" pod="openshift-marketplace/community-operators-2p4vh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-2p4vh\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:54 crc kubenswrapper[4957]: I0123 10:55:54.984854 4957 status_manager.go:851] "Failed to get status for pod" podUID="9072b86d-252b-4804-ab47-be737d2a88ee" pod="openshift-marketplace/redhat-operators-xhhxg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-xhhxg\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:54 crc kubenswrapper[4957]: I0123 10:55:54.985046 4957 status_manager.go:851] "Failed to get status for pod" podUID="4856b826-759f-4132-bfc5-d80385771e22" pod="openshift-image-registry/image-registry-66df7c8f76-p5lm9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-registry-66df7c8f76-p5lm9\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:54 crc kubenswrapper[4957]: I0123 10:55:54.985228 4957 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:54 crc kubenswrapper[4957]: I0123 10:55:54.985444 4957 status_manager.go:851] "Failed to get status for pod" podUID="d86df6f9-d655-44a7-a92e-8c7bf90d92af" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:54 crc kubenswrapper[4957]: I0123 10:55:54.985634 4957 status_manager.go:851] "Failed to get status for pod" podUID="dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d" pod="openshift-authentication/oauth-openshift-558db77b4-wtmvm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-wtmvm\": dial tcp 38.102.83.9:6443: connect: connection refused" 
Jan 23 10:55:54 crc kubenswrapper[4957]: I0123 10:55:54.985822 4957 status_manager.go:851] "Failed to get status for pod" podUID="0aee337e-503a-46b7-8c0b-4e69a7618a9b" pod="openshift-marketplace/redhat-marketplace-mm4q6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mm4q6\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:54 crc kubenswrapper[4957]: I0123 10:55:54.986003 4957 status_manager.go:851] "Failed to get status for pod" podUID="cd776179-759c-4fd9-987a-445a1d516d4c" pod="openshift-marketplace/certified-operators-bsgcz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-bsgcz\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:55 crc kubenswrapper[4957]: E0123 10:55:55.256988 4957 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.9:6443: connect: connection refused" interval="7s" Jan 23 10:55:55 crc kubenswrapper[4957]: I0123 10:55:55.466953 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 23 10:55:57 crc kubenswrapper[4957]: I0123 10:55:57.769153 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 23 10:55:57 crc kubenswrapper[4957]: I0123 10:55:57.770056 4957 status_manager.go:851] "Failed to get status for pod" podUID="d9dc559e-59e7-459c-9a4a-fb361bffad34" pod="openshift-marketplace/community-operators-2p4vh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-2p4vh\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:57 crc kubenswrapper[4957]: I0123 10:55:57.770470 4957 status_manager.go:851] "Failed to get status for pod" podUID="9072b86d-252b-4804-ab47-be737d2a88ee" pod="openshift-marketplace/redhat-operators-xhhxg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-xhhxg\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:57 crc kubenswrapper[4957]: I0123 10:55:57.770838 4957 status_manager.go:851] "Failed to get status for pod" podUID="4856b826-759f-4132-bfc5-d80385771e22" pod="openshift-image-registry/image-registry-66df7c8f76-p5lm9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-registry-66df7c8f76-p5lm9\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:57 crc kubenswrapper[4957]: I0123 10:55:57.771249 4957 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:57 crc kubenswrapper[4957]: I0123 10:55:57.771872 4957 status_manager.go:851] "Failed to get status for pod" podUID="d86df6f9-d655-44a7-a92e-8c7bf90d92af" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:57 crc kubenswrapper[4957]: I0123 
10:55:57.772305 4957 status_manager.go:851] "Failed to get status for pod" podUID="dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d" pod="openshift-authentication/oauth-openshift-558db77b4-wtmvm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-wtmvm\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:57 crc kubenswrapper[4957]: I0123 10:55:57.772676 4957 status_manager.go:851] "Failed to get status for pod" podUID="0aee337e-503a-46b7-8c0b-4e69a7618a9b" pod="openshift-marketplace/redhat-marketplace-mm4q6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mm4q6\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:57 crc kubenswrapper[4957]: I0123 10:55:57.773324 4957 status_manager.go:851] "Failed to get status for pod" podUID="cd776179-759c-4fd9-987a-445a1d516d4c" pod="openshift-marketplace/certified-operators-bsgcz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-bsgcz\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:57 crc kubenswrapper[4957]: I0123 10:55:57.783378 4957 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ea507738-b425-4366-808b-3a47317e66d0" Jan 23 10:55:57 crc kubenswrapper[4957]: I0123 10:55:57.783404 4957 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ea507738-b425-4366-808b-3a47317e66d0" Jan 23 10:55:57 crc kubenswrapper[4957]: E0123 10:55:57.783915 4957 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 23 10:55:57 crc kubenswrapper[4957]: I0123 10:55:57.784587 4957 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 23 10:55:58 crc kubenswrapper[4957]: I0123 10:55:58.487038 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"f3db0269d2334d320d575443551e92e6dec0827fe79018171f131099ce58c318"} Jan 23 10:55:59 crc kubenswrapper[4957]: I0123 10:55:59.494915 4957 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="8dafb6a49748e3c07cf916917471a30c72955865ddad6ed6f5108cc7d43fd536" exitCode=0 Jan 23 10:55:59 crc kubenswrapper[4957]: I0123 10:55:59.494962 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"8dafb6a49748e3c07cf916917471a30c72955865ddad6ed6f5108cc7d43fd536"} Jan 23 10:55:59 crc kubenswrapper[4957]: I0123 10:55:59.496849 4957 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ea507738-b425-4366-808b-3a47317e66d0" Jan 23 10:55:59 crc kubenswrapper[4957]: I0123 10:55:59.496912 4957 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ea507738-b425-4366-808b-3a47317e66d0" Jan 23 10:55:59 crc kubenswrapper[4957]: I0123 10:55:59.498015 4957 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:59 crc kubenswrapper[4957]: E0123 10:55:59.498482 4957 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 23 10:55:59 crc kubenswrapper[4957]: I0123 10:55:59.498781 4957 status_manager.go:851] "Failed to get status for pod" podUID="d86df6f9-d655-44a7-a92e-8c7bf90d92af" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:59 crc kubenswrapper[4957]: I0123 10:55:59.499159 4957 status_manager.go:851] "Failed to get status for pod" podUID="0aee337e-503a-46b7-8c0b-4e69a7618a9b" pod="openshift-marketplace/redhat-marketplace-mm4q6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mm4q6\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:59 crc kubenswrapper[4957]: I0123 10:55:59.499477 4957 status_manager.go:851] "Failed to get status for pod" podUID="dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d" pod="openshift-authentication/oauth-openshift-558db77b4-wtmvm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-wtmvm\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:59 crc kubenswrapper[4957]: I0123 10:55:59.499728 4957 status_manager.go:851] "Failed to get status for pod" podUID="cd776179-759c-4fd9-987a-445a1d516d4c" pod="openshift-marketplace/certified-operators-bsgcz" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-bsgcz\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:59 crc kubenswrapper[4957]: I0123 10:55:59.499999 4957 status_manager.go:851] "Failed to get status for pod" podUID="d9dc559e-59e7-459c-9a4a-fb361bffad34" pod="openshift-marketplace/community-operators-2p4vh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-2p4vh\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:59 crc kubenswrapper[4957]: I0123 10:55:59.500434 4957 status_manager.go:851] "Failed to get status for pod" podUID="9072b86d-252b-4804-ab47-be737d2a88ee" pod="openshift-marketplace/redhat-operators-xhhxg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-xhhxg\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:55:59 crc kubenswrapper[4957]: I0123 10:55:59.500677 4957 status_manager.go:851] "Failed to get status for pod" podUID="4856b826-759f-4132-bfc5-d80385771e22" pod="openshift-image-registry/image-registry-66df7c8f76-p5lm9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-registry-66df7c8f76-p5lm9\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 23 10:56:00 crc kubenswrapper[4957]: I0123 10:56:00.506210 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"404d9fd714f1c11a3fbb269cb2eb8d81d53ce914820624227361eca05fdc9bc6"} Jan 23 10:56:00 crc kubenswrapper[4957]: I0123 10:56:00.506561 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"77e2eeae224f37ff48d2bcf73d3daeab37e2499f6fd9afc1c11de2aadc8aa97c"} Jan 23 10:56:01 crc kubenswrapper[4957]: I0123 10:56:01.513916 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"3ed4cd7f2bc02fbf0987364df9b1fcb2c64ddf412c7b614e0e3c48dc72af4328"} Jan 23 10:56:01 crc kubenswrapper[4957]: I0123 10:56:01.515066 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"91b87a6597edd2f5dee700daad6c0ec0bbe72fb16d5b1cc6adda909e16e653e2"} Jan 23 10:56:03 crc kubenswrapper[4957]: I0123 10:56:03.526984 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"3bc7a02036dc2a62bc455f5d3101cb655f3d527d5f293547fa91b50b3762416a"} Jan 23 10:56:03 crc kubenswrapper[4957]: I0123 10:56:03.527456 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 23 10:56:03 crc kubenswrapper[4957]: I0123 10:56:03.527371 4957 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ea507738-b425-4366-808b-3a47317e66d0" Jan 23 10:56:03 crc kubenswrapper[4957]: I0123 10:56:03.527493 4957 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ea507738-b425-4366-808b-3a47317e66d0" Jan 23 
10:56:03 crc kubenswrapper[4957]: I0123 10:56:03.534549 4957 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 23 10:56:04 crc kubenswrapper[4957]: I0123 10:56:04.531221 4957 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ea507738-b425-4366-808b-3a47317e66d0" Jan 23 10:56:04 crc kubenswrapper[4957]: I0123 10:56:04.531541 4957 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ea507738-b425-4366-808b-3a47317e66d0" Jan 23 10:56:06 crc kubenswrapper[4957]: I0123 10:56:06.456550 4957 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="7dab7f2d-e792-46a4-96c2-edc725935ca9" Jan 23 10:56:07 crc kubenswrapper[4957]: I0123 10:56:07.791410 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 23 10:56:30 crc kubenswrapper[4957]: I0123 10:56:30.630176 4957 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials Jan 23 10:56:31 crc kubenswrapper[4957]: I0123 10:56:31.308242 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Jan 23 10:56:32 crc kubenswrapper[4957]: I0123 10:56:32.822267 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Jan 23 10:56:34 crc kubenswrapper[4957]: I0123 10:56:34.484431 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Jan 23 10:56:34 crc kubenswrapper[4957]: I0123 10:56:34.493904 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Jan 23 10:56:34 crc kubenswrapper[4957]: I0123 10:56:34.860773 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Jan 23 10:56:35 crc kubenswrapper[4957]: I0123 10:56:35.414347 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Jan 23 10:56:35 crc kubenswrapper[4957]: I0123 10:56:35.511424 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Jan 23 10:56:35 crc kubenswrapper[4957]: I0123 10:56:35.764452 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Jan 23 10:56:36 crc kubenswrapper[4957]: I0123 10:56:36.018736 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Jan 23 10:56:36 crc kubenswrapper[4957]: I0123 10:56:36.053685 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Jan 23 10:56:36 crc kubenswrapper[4957]: I0123 10:56:36.529044 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Jan 23 10:56:36 crc kubenswrapper[4957]: I0123 10:56:36.589301 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Jan 23 10:56:36 crc kubenswrapper[4957]: 
I0123 10:56:36.791689 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Jan 23 10:56:37 crc kubenswrapper[4957]: I0123 10:56:37.173138 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Jan 23 10:56:37 crc kubenswrapper[4957]: I0123 10:56:37.302503 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Jan 23 10:56:37 crc kubenswrapper[4957]: I0123 10:56:37.493332 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Jan 23 10:56:37 crc kubenswrapper[4957]: I0123 10:56:37.590453 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Jan 23 10:56:37 crc kubenswrapper[4957]: I0123 10:56:37.610468 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Jan 23 10:56:37 crc kubenswrapper[4957]: I0123 10:56:37.765689 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Jan 23 10:56:37 crc kubenswrapper[4957]: I0123 10:56:37.785923 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Jan 23 10:56:38 crc kubenswrapper[4957]: I0123 10:56:38.186559 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Jan 23 10:56:38 crc kubenswrapper[4957]: I0123 10:56:38.235239 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Jan 23 10:56:38 crc kubenswrapper[4957]: I0123 10:56:38.306904 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Jan 23 10:56:38 crc kubenswrapper[4957]: I0123 10:56:38.330307 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Jan 23 10:56:38 crc kubenswrapper[4957]: I0123 10:56:38.505610 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Jan 23 10:56:38 crc kubenswrapper[4957]: I0123 10:56:38.630971 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Jan 23 10:56:38 crc kubenswrapper[4957]: I0123 10:56:38.706221 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Jan 23 10:56:38 crc kubenswrapper[4957]: I0123 10:56:38.919337 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Jan 23 10:56:39 crc kubenswrapper[4957]: I0123 10:56:39.028210 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Jan 23 10:56:39 crc kubenswrapper[4957]: I0123 10:56:39.330187 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 23 10:56:39 crc kubenswrapper[4957]: I0123 10:56:39.372360 4957 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager"/"client-ca" Jan 23 10:56:39 crc kubenswrapper[4957]: I0123 10:56:39.425902 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Jan 23 10:56:39 crc kubenswrapper[4957]: I0123 10:56:39.719175 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Jan 23 10:56:39 crc kubenswrapper[4957]: I0123 10:56:39.984960 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Jan 23 10:56:40 crc kubenswrapper[4957]: I0123 10:56:40.165294 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Jan 23 10:56:40 crc kubenswrapper[4957]: I0123 10:56:40.330178 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Jan 23 10:56:40 crc kubenswrapper[4957]: I0123 10:56:40.456062 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Jan 23 10:56:40 crc kubenswrapper[4957]: I0123 10:56:40.490505 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Jan 23 10:56:40 crc kubenswrapper[4957]: I0123 10:56:40.683415 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Jan 23 10:56:40 crc kubenswrapper[4957]: I0123 10:56:40.916982 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Jan 23 10:56:40 crc kubenswrapper[4957]: I0123 10:56:40.941978 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Jan 23 10:56:41 crc kubenswrapper[4957]: I0123 10:56:41.065967 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Jan 23 10:56:41 crc kubenswrapper[4957]: I0123 10:56:41.175482 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Jan 23 10:56:41 crc kubenswrapper[4957]: I0123 10:56:41.222632 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Jan 23 10:56:41 crc kubenswrapper[4957]: I0123 10:56:41.393131 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Jan 23 10:56:41 crc kubenswrapper[4957]: I0123 10:56:41.582508 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Jan 23 10:56:41 crc kubenswrapper[4957]: I0123 10:56:41.593665 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Jan 23 10:56:41 crc kubenswrapper[4957]: I0123 10:56:41.620184 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Jan 23 10:56:41 crc kubenswrapper[4957]: I0123 10:56:41.700564 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Jan 23 10:56:41 crc kubenswrapper[4957]: I0123 10:56:41.795843 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Jan 23 
10:56:41 crc kubenswrapper[4957]: I0123 10:56:41.861829 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Jan 23 10:56:41 crc kubenswrapper[4957]: I0123 10:56:41.863877 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Jan 23 10:56:41 crc kubenswrapper[4957]: I0123 10:56:41.957377 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Jan 23 10:56:42 crc kubenswrapper[4957]: I0123 10:56:42.014780 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Jan 23 10:56:42 crc kubenswrapper[4957]: I0123 10:56:42.030535 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Jan 23 10:56:42 crc kubenswrapper[4957]: I0123 10:56:42.106187 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Jan 23 10:56:42 crc kubenswrapper[4957]: I0123 10:56:42.164234 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Jan 23 10:56:42 crc kubenswrapper[4957]: I0123 10:56:42.440536 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Jan 23 10:56:42 crc kubenswrapper[4957]: I0123 10:56:42.613590 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Jan 23 10:56:42 crc kubenswrapper[4957]: I0123 10:56:42.687719 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Jan 23 10:56:42 crc kubenswrapper[4957]: I0123 10:56:42.744383 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Jan 23 10:56:42 crc kubenswrapper[4957]: I0123 10:56:42.798110 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Jan 23 10:56:42 crc kubenswrapper[4957]: I0123 10:56:42.809305 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Jan 23 10:56:42 crc kubenswrapper[4957]: I0123 10:56:42.881501 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Jan 23 10:56:42 crc kubenswrapper[4957]: I0123 10:56:42.945355 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 23 10:56:42 crc kubenswrapper[4957]: I0123 10:56:42.947244 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Jan 23 10:56:42 crc kubenswrapper[4957]: I0123 10:56:42.990860 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Jan 23 10:56:43 crc kubenswrapper[4957]: I0123 10:56:43.109853 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Jan 23 10:56:43 crc kubenswrapper[4957]: I0123 10:56:43.297762 4957 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Jan 23 10:56:43 crc kubenswrapper[4957]: I0123 10:56:43.314887 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Jan 23 10:56:43 crc kubenswrapper[4957]: I0123 10:56:43.469658 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Jan 23 10:56:43 crc kubenswrapper[4957]: I0123 10:56:43.521868 4957 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Jan 23 10:56:43 crc kubenswrapper[4957]: I0123 10:56:43.630762 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Jan 23 10:56:43 crc kubenswrapper[4957]: I0123 10:56:43.819412 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Jan 23 10:56:43 crc kubenswrapper[4957]: I0123 10:56:43.906930 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Jan 23 10:56:44 crc kubenswrapper[4957]: I0123 10:56:44.007713 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Jan 23 10:56:44 crc kubenswrapper[4957]: I0123 10:56:44.225256 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Jan 23 10:56:44 crc kubenswrapper[4957]: I0123 10:56:44.445735 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Jan 23 10:56:44 crc kubenswrapper[4957]: I0123 10:56:44.624529 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Jan 23 10:56:45 crc kubenswrapper[4957]: I0123 10:56:45.119316 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Jan 23 10:56:45 crc kubenswrapper[4957]: I0123 10:56:45.187039 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Jan 23 10:56:45 crc kubenswrapper[4957]: I0123 10:56:45.195677 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Jan 23 10:56:45 crc kubenswrapper[4957]: I0123 10:56:45.337563 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Jan 23 10:56:45 crc kubenswrapper[4957]: I0123 10:56:45.341479 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Jan 23 10:56:45 crc kubenswrapper[4957]: I0123 10:56:45.646846 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Jan 23 10:56:45 crc kubenswrapper[4957]: I0123 10:56:45.823871 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Jan 23 10:56:45 crc kubenswrapper[4957]: I0123 10:56:45.824155 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Jan 23 10:56:45 crc kubenswrapper[4957]: I0123 10:56:45.855415 4957 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Jan 23 10:56:45 crc kubenswrapper[4957]: I0123 10:56:45.863849 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Jan 23 10:56:45 crc kubenswrapper[4957]: I0123 10:56:45.871892 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Jan 23 10:56:45 crc kubenswrapper[4957]: I0123 10:56:45.878960 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Jan 23 10:56:45 crc kubenswrapper[4957]: I0123 10:56:45.925247 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Jan 23 10:56:45 crc kubenswrapper[4957]: I0123 10:56:45.952582 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Jan 23 10:56:46 crc kubenswrapper[4957]: I0123 10:56:46.059829 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Jan 23 10:56:46 crc kubenswrapper[4957]: I0123 10:56:46.100094 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Jan 23 10:56:46 crc kubenswrapper[4957]: I0123 10:56:46.144358 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Jan 23 10:56:46 crc kubenswrapper[4957]: I0123 10:56:46.192772 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Jan 23 10:56:46 crc kubenswrapper[4957]: I0123 10:56:46.270632 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Jan 23 10:56:46 crc kubenswrapper[4957]: I0123 10:56:46.287810 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Jan 23 10:56:46 crc kubenswrapper[4957]: I0123 10:56:46.691192 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Jan 23 10:56:46 crc kubenswrapper[4957]: I0123 10:56:46.699395 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Jan 23 10:56:46 crc kubenswrapper[4957]: I0123 10:56:46.959552 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Jan 23 10:56:47 crc kubenswrapper[4957]: I0123 10:56:47.174146 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Jan 23 10:56:47 crc kubenswrapper[4957]: I0123 10:56:47.204744 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Jan 23 10:56:47 crc kubenswrapper[4957]: I0123 10:56:47.266944 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Jan 23 10:56:47 crc kubenswrapper[4957]: I0123 10:56:47.357353 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Jan 23 10:56:47 crc kubenswrapper[4957]: I0123 10:56:47.396789 4957 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Jan 23 10:56:47 crc kubenswrapper[4957]: I0123 10:56:47.414478 4957 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Jan 23 10:56:47 crc kubenswrapper[4957]: I0123 10:56:47.419446 4957 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xhhxg" podStartSLOduration=76.569733382 podStartE2EDuration="1m19.41942297s" podCreationTimestamp="2026-01-23 10:55:28 +0000 UTC" firstStartedPulling="2026-01-23 10:55:30.006028384 +0000 UTC m=+239.543281071" lastFinishedPulling="2026-01-23 10:55:32.855717972 +0000 UTC m=+242.392970659" observedRunningTime="2026-01-23 10:56:06.493611097 +0000 UTC m=+276.030863784" watchObservedRunningTime="2026-01-23 10:56:47.41942297 +0000 UTC m=+316.956675777" Jan 23 10:56:47 crc kubenswrapper[4957]: I0123 10:56:47.421698 4957 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-2p4vh" podStartSLOduration=76.004473695 podStartE2EDuration="1m18.421675875s" podCreationTimestamp="2026-01-23 10:55:29 +0000 UTC" firstStartedPulling="2026-01-23 10:55:30.99030979 +0000 UTC m=+240.527562467" lastFinishedPulling="2026-01-23 10:55:33.40751196 +0000 UTC m=+242.944764647" observedRunningTime="2026-01-23 10:56:06.476443392 +0000 UTC m=+276.013696069" watchObservedRunningTime="2026-01-23 10:56:47.421675875 +0000 UTC m=+316.958928562" Jan 23 10:56:47 crc kubenswrapper[4957]: I0123 10:56:47.422050 4957 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-bsgcz" podStartSLOduration=77.679973668 podStartE2EDuration="1m20.422044306s" podCreationTimestamp="2026-01-23 10:55:27 +0000 UTC" firstStartedPulling="2026-01-23 10:55:28.957846751 +0000 UTC m=+238.495099438" lastFinishedPulling="2026-01-23 10:55:31.699917379 +0000 UTC m=+241.237170076" observedRunningTime="2026-01-23 10:56:06.453759006 +0000 UTC m=+275.991011693" watchObservedRunningTime="2026-01-23 10:56:47.422044306 +0000 UTC m=+316.959296993" Jan 23 10:56:47 crc kubenswrapper[4957]: I0123 10:56:47.422708 4957 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-mm4q6" podStartSLOduration=78.600406771 podStartE2EDuration="1m21.422703484s" podCreationTimestamp="2026-01-23 10:55:26 +0000 UTC" firstStartedPulling="2026-01-23 10:55:28.952175979 +0000 UTC m=+238.489428676" lastFinishedPulling="2026-01-23 10:55:31.774472702 +0000 UTC m=+241.311725389" observedRunningTime="2026-01-23 10:56:06.439844684 +0000 UTC m=+275.977097371" watchObservedRunningTime="2026-01-23 10:56:47.422703484 +0000 UTC m=+316.959956171" Jan 23 10:56:47 crc kubenswrapper[4957]: I0123 10:56:47.423266 4957 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-authentication/oauth-openshift-558db77b4-wtmvm"] Jan 23 10:56:47 crc kubenswrapper[4957]: I0123 10:56:47.423335 4957 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-authentication/oauth-openshift-7f8484fbcc-vbk8p"] Jan 23 10:56:47 crc kubenswrapper[4957]: E0123 10:56:47.423560 4957 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d" containerName="oauth-openshift" Jan 23 10:56:47 crc kubenswrapper[4957]: I0123 10:56:47.423578 4957 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d" containerName="oauth-openshift" Jan 23 10:56:47 crc kubenswrapper[4957]: E0123 10:56:47.423592 4957 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d86df6f9-d655-44a7-a92e-8c7bf90d92af" containerName="installer" Jan 23 10:56:47 crc kubenswrapper[4957]: I0123 10:56:47.423599 4957 state_mem.go:107] "Deleted CPUSet assignment" podUID="d86df6f9-d655-44a7-a92e-8c7bf90d92af" containerName="installer" Jan 23 10:56:47 crc kubenswrapper[4957]: I0123 10:56:47.423735 4957 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d" containerName="oauth-openshift" Jan 23 10:56:47 crc kubenswrapper[4957]: I0123 10:56:47.423751 4957 memory_manager.go:354] "RemoveStaleState removing state" podUID="d86df6f9-d655-44a7-a92e-8c7bf90d92af" containerName="installer" Jan 23 10:56:47 crc kubenswrapper[4957]: I0123 10:56:47.424152 4957 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-hgc64"] Jan 23 10:56:47 crc kubenswrapper[4957]: I0123 10:56:47.424268 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-7f8484fbcc-vbk8p" Jan 23 10:56:47 crc kubenswrapper[4957]: I0123 10:56:47.426340 4957 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ea507738-b425-4366-808b-3a47317e66d0" Jan 23 10:56:47 crc kubenswrapper[4957]: I0123 10:56:47.426384 4957 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ea507738-b425-4366-808b-3a47317e66d0" Jan 23 10:56:47 crc kubenswrapper[4957]: I0123 10:56:47.428123 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Jan 23 10:56:47 crc kubenswrapper[4957]: I0123 10:56:47.432012 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 23 10:56:47 crc kubenswrapper[4957]: I0123 10:56:47.434144 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Jan 23 10:56:47 crc kubenswrapper[4957]: I0123 10:56:47.434145 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Jan 23 10:56:47 crc kubenswrapper[4957]: I0123 10:56:47.434247 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Jan 23 10:56:47 crc kubenswrapper[4957]: I0123 10:56:47.434376 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Jan 23 10:56:47 crc kubenswrapper[4957]: I0123 10:56:47.434496 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Jan 23 10:56:47 crc kubenswrapper[4957]: I0123 10:56:47.439132 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Jan 23 10:56:47 crc kubenswrapper[4957]: I0123 10:56:47.439168 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Jan 23 10:56:47 crc kubenswrapper[4957]: I0123 10:56:47.440601 4957 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Jan 23 10:56:47 crc kubenswrapper[4957]: I0123 10:56:47.442357 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Jan 23 10:56:47 crc kubenswrapper[4957]: I0123 10:56:47.442499 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Jan 23 10:56:47 crc kubenswrapper[4957]: I0123 10:56:47.442455 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Jan 23 10:56:47 crc kubenswrapper[4957]: I0123 10:56:47.445848 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Jan 23 10:56:47 crc kubenswrapper[4957]: I0123 10:56:47.447944 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Jan 23 10:56:47 crc kubenswrapper[4957]: I0123 10:56:47.451911 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/3d80894e-00e6-4d20-9ce3-398bb8f27e66-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7f8484fbcc-vbk8p\" (UID: \"3d80894e-00e6-4d20-9ce3-398bb8f27e66\") " pod="openshift-authentication/oauth-openshift-7f8484fbcc-vbk8p" Jan 23 10:56:47 crc kubenswrapper[4957]: I0123 10:56:47.451980 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/3d80894e-00e6-4d20-9ce3-398bb8f27e66-v4-0-config-user-template-error\") pod \"oauth-openshift-7f8484fbcc-vbk8p\" (UID: \"3d80894e-00e6-4d20-9ce3-398bb8f27e66\") " pod="openshift-authentication/oauth-openshift-7f8484fbcc-vbk8p" Jan 23 10:56:47 crc kubenswrapper[4957]: I0123 10:56:47.452037 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3d80894e-00e6-4d20-9ce3-398bb8f27e66-v4-0-config-system-router-certs\") pod \"oauth-openshift-7f8484fbcc-vbk8p\" (UID: \"3d80894e-00e6-4d20-9ce3-398bb8f27e66\") " pod="openshift-authentication/oauth-openshift-7f8484fbcc-vbk8p" Jan 23 10:56:47 crc kubenswrapper[4957]: I0123 10:56:47.452076 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/3d80894e-00e6-4d20-9ce3-398bb8f27e66-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7f8484fbcc-vbk8p\" (UID: \"3d80894e-00e6-4d20-9ce3-398bb8f27e66\") " pod="openshift-authentication/oauth-openshift-7f8484fbcc-vbk8p" Jan 23 10:56:47 crc kubenswrapper[4957]: I0123 10:56:47.452135 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3d80894e-00e6-4d20-9ce3-398bb8f27e66-audit-dir\") pod \"oauth-openshift-7f8484fbcc-vbk8p\" (UID: \"3d80894e-00e6-4d20-9ce3-398bb8f27e66\") " pod="openshift-authentication/oauth-openshift-7f8484fbcc-vbk8p" Jan 23 10:56:47 crc kubenswrapper[4957]: I0123 10:56:47.452243 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/3d80894e-00e6-4d20-9ce3-398bb8f27e66-v4-0-config-user-template-login\") pod \"oauth-openshift-7f8484fbcc-vbk8p\" (UID: \"3d80894e-00e6-4d20-9ce3-398bb8f27e66\") " pod="openshift-authentication/oauth-openshift-7f8484fbcc-vbk8p" Jan 23 10:56:47 crc kubenswrapper[4957]: I0123 10:56:47.452286 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/3d80894e-00e6-4d20-9ce3-398bb8f27e66-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7f8484fbcc-vbk8p\" (UID: \"3d80894e-00e6-4d20-9ce3-398bb8f27e66\") " pod="openshift-authentication/oauth-openshift-7f8484fbcc-vbk8p" Jan 23 10:56:47 crc kubenswrapper[4957]: I0123 10:56:47.452315 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/3d80894e-00e6-4d20-9ce3-398bb8f27e66-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7f8484fbcc-vbk8p\" (UID: \"3d80894e-00e6-4d20-9ce3-398bb8f27e66\") " pod="openshift-authentication/oauth-openshift-7f8484fbcc-vbk8p" Jan 23 10:56:47 crc kubenswrapper[4957]: I0123 10:56:47.452345 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3d80894e-00e6-4d20-9ce3-398bb8f27e66-v4-0-config-system-session\") pod \"oauth-openshift-7f8484fbcc-vbk8p\" (UID: \"3d80894e-00e6-4d20-9ce3-398bb8f27e66\") " pod="openshift-authentication/oauth-openshift-7f8484fbcc-vbk8p" Jan 23 10:56:47 crc kubenswrapper[4957]: I0123 10:56:47.452385 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3d80894e-00e6-4d20-9ce3-398bb8f27e66-audit-policies\") pod \"oauth-openshift-7f8484fbcc-vbk8p\" (UID: \"3d80894e-00e6-4d20-9ce3-398bb8f27e66\") " pod="openshift-authentication/oauth-openshift-7f8484fbcc-vbk8p" Jan 23 10:56:47 crc kubenswrapper[4957]: I0123 10:56:47.452409 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dqqw\" (UniqueName: \"kubernetes.io/projected/3d80894e-00e6-4d20-9ce3-398bb8f27e66-kube-api-access-5dqqw\") pod \"oauth-openshift-7f8484fbcc-vbk8p\" (UID: \"3d80894e-00e6-4d20-9ce3-398bb8f27e66\") " pod="openshift-authentication/oauth-openshift-7f8484fbcc-vbk8p" Jan 23 10:56:47 crc kubenswrapper[4957]: I0123 10:56:47.452438 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3d80894e-00e6-4d20-9ce3-398bb8f27e66-v4-0-config-system-service-ca\") pod \"oauth-openshift-7f8484fbcc-vbk8p\" (UID: \"3d80894e-00e6-4d20-9ce3-398bb8f27e66\") " pod="openshift-authentication/oauth-openshift-7f8484fbcc-vbk8p" Jan 23 10:56:47 crc kubenswrapper[4957]: I0123 10:56:47.452463 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3d80894e-00e6-4d20-9ce3-398bb8f27e66-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7f8484fbcc-vbk8p\" (UID: \"3d80894e-00e6-4d20-9ce3-398bb8f27e66\") " pod="openshift-authentication/oauth-openshift-7f8484fbcc-vbk8p" Jan 23 10:56:47 crc kubenswrapper[4957]: I0123 10:56:47.452490 4957 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3d80894e-00e6-4d20-9ce3-398bb8f27e66-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7f8484fbcc-vbk8p\" (UID: \"3d80894e-00e6-4d20-9ce3-398bb8f27e66\") " pod="openshift-authentication/oauth-openshift-7f8484fbcc-vbk8p" Jan 23 10:56:47 crc kubenswrapper[4957]: I0123 10:56:47.454667 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Jan 23 10:56:47 crc kubenswrapper[4957]: I0123 10:56:47.468481 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Jan 23 10:56:47 crc kubenswrapper[4957]: I0123 10:56:47.477801 4957 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=44.477775336 podStartE2EDuration="44.477775336s" podCreationTimestamp="2026-01-23 10:56:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-23 10:56:47.474598964 +0000 UTC m=+317.011851681" watchObservedRunningTime="2026-01-23 10:56:47.477775336 +0000 UTC m=+317.015028023" Jan 23 10:56:47 crc kubenswrapper[4957]: I0123 10:56:47.549914 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Jan 23 10:56:47 crc kubenswrapper[4957]: I0123 10:56:47.553140 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/3d80894e-00e6-4d20-9ce3-398bb8f27e66-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7f8484fbcc-vbk8p\" (UID: \"3d80894e-00e6-4d20-9ce3-398bb8f27e66\") " pod="openshift-authentication/oauth-openshift-7f8484fbcc-vbk8p" Jan 23 10:56:47 crc kubenswrapper[4957]: I0123 10:56:47.553181 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3d80894e-00e6-4d20-9ce3-398bb8f27e66-audit-dir\") pod \"oauth-openshift-7f8484fbcc-vbk8p\" (UID: \"3d80894e-00e6-4d20-9ce3-398bb8f27e66\") " pod="openshift-authentication/oauth-openshift-7f8484fbcc-vbk8p" Jan 23 10:56:47 crc kubenswrapper[4957]: I0123 10:56:47.553209 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/3d80894e-00e6-4d20-9ce3-398bb8f27e66-v4-0-config-user-template-login\") pod \"oauth-openshift-7f8484fbcc-vbk8p\" (UID: \"3d80894e-00e6-4d20-9ce3-398bb8f27e66\") " pod="openshift-authentication/oauth-openshift-7f8484fbcc-vbk8p" Jan 23 10:56:47 crc kubenswrapper[4957]: I0123 10:56:47.553231 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/3d80894e-00e6-4d20-9ce3-398bb8f27e66-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7f8484fbcc-vbk8p\" (UID: \"3d80894e-00e6-4d20-9ce3-398bb8f27e66\") " pod="openshift-authentication/oauth-openshift-7f8484fbcc-vbk8p" Jan 23 10:56:47 crc kubenswrapper[4957]: I0123 10:56:47.553251 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/3d80894e-00e6-4d20-9ce3-398bb8f27e66-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7f8484fbcc-vbk8p\" (UID: \"3d80894e-00e6-4d20-9ce3-398bb8f27e66\") " pod="openshift-authentication/oauth-openshift-7f8484fbcc-vbk8p" Jan 23 10:56:47 crc kubenswrapper[4957]: I0123 10:56:47.553271 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3d80894e-00e6-4d20-9ce3-398bb8f27e66-v4-0-config-system-session\") pod \"oauth-openshift-7f8484fbcc-vbk8p\" (UID: \"3d80894e-00e6-4d20-9ce3-398bb8f27e66\") " pod="openshift-authentication/oauth-openshift-7f8484fbcc-vbk8p" Jan 23 10:56:47 crc kubenswrapper[4957]: I0123 10:56:47.553329 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3d80894e-00e6-4d20-9ce3-398bb8f27e66-audit-policies\") pod \"oauth-openshift-7f8484fbcc-vbk8p\" (UID: \"3d80894e-00e6-4d20-9ce3-398bb8f27e66\") " pod="openshift-authentication/oauth-openshift-7f8484fbcc-vbk8p" Jan 23 10:56:47 crc kubenswrapper[4957]: I0123 10:56:47.553353 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dqqw\" (UniqueName: \"kubernetes.io/projected/3d80894e-00e6-4d20-9ce3-398bb8f27e66-kube-api-access-5dqqw\") pod \"oauth-openshift-7f8484fbcc-vbk8p\" (UID: \"3d80894e-00e6-4d20-9ce3-398bb8f27e66\") " pod="openshift-authentication/oauth-openshift-7f8484fbcc-vbk8p" Jan 23 10:56:47 crc kubenswrapper[4957]: I0123 10:56:47.553386 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3d80894e-00e6-4d20-9ce3-398bb8f27e66-v4-0-config-system-service-ca\") pod \"oauth-openshift-7f8484fbcc-vbk8p\" (UID: \"3d80894e-00e6-4d20-9ce3-398bb8f27e66\") " pod="openshift-authentication/oauth-openshift-7f8484fbcc-vbk8p" Jan 23 10:56:47 crc kubenswrapper[4957]: I0123 10:56:47.553414 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3d80894e-00e6-4d20-9ce3-398bb8f27e66-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7f8484fbcc-vbk8p\" (UID: \"3d80894e-00e6-4d20-9ce3-398bb8f27e66\") " pod="openshift-authentication/oauth-openshift-7f8484fbcc-vbk8p" Jan 23 10:56:47 crc kubenswrapper[4957]: I0123 10:56:47.553444 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3d80894e-00e6-4d20-9ce3-398bb8f27e66-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7f8484fbcc-vbk8p\" (UID: \"3d80894e-00e6-4d20-9ce3-398bb8f27e66\") " pod="openshift-authentication/oauth-openshift-7f8484fbcc-vbk8p" Jan 23 10:56:47 crc kubenswrapper[4957]: I0123 10:56:47.553469 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/3d80894e-00e6-4d20-9ce3-398bb8f27e66-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7f8484fbcc-vbk8p\" (UID: \"3d80894e-00e6-4d20-9ce3-398bb8f27e66\") " pod="openshift-authentication/oauth-openshift-7f8484fbcc-vbk8p" Jan 23 10:56:47 crc kubenswrapper[4957]: I0123 10:56:47.553467 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/3d80894e-00e6-4d20-9ce3-398bb8f27e66-audit-dir\") pod \"oauth-openshift-7f8484fbcc-vbk8p\" (UID: \"3d80894e-00e6-4d20-9ce3-398bb8f27e66\") " pod="openshift-authentication/oauth-openshift-7f8484fbcc-vbk8p" Jan 23 10:56:47 crc kubenswrapper[4957]: I0123 10:56:47.553488 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/3d80894e-00e6-4d20-9ce3-398bb8f27e66-v4-0-config-user-template-error\") pod \"oauth-openshift-7f8484fbcc-vbk8p\" (UID: \"3d80894e-00e6-4d20-9ce3-398bb8f27e66\") " pod="openshift-authentication/oauth-openshift-7f8484fbcc-vbk8p" Jan 23 10:56:47 crc kubenswrapper[4957]: I0123 10:56:47.553615 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3d80894e-00e6-4d20-9ce3-398bb8f27e66-v4-0-config-system-router-certs\") pod \"oauth-openshift-7f8484fbcc-vbk8p\" (UID: \"3d80894e-00e6-4d20-9ce3-398bb8f27e66\") " pod="openshift-authentication/oauth-openshift-7f8484fbcc-vbk8p" Jan 23 10:56:47 crc kubenswrapper[4957]: I0123 10:56:47.554754 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3d80894e-00e6-4d20-9ce3-398bb8f27e66-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7f8484fbcc-vbk8p\" (UID: \"3d80894e-00e6-4d20-9ce3-398bb8f27e66\") " pod="openshift-authentication/oauth-openshift-7f8484fbcc-vbk8p" Jan 23 10:56:47 crc kubenswrapper[4957]: I0123 10:56:47.555035 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3d80894e-00e6-4d20-9ce3-398bb8f27e66-v4-0-config-system-service-ca\") pod \"oauth-openshift-7f8484fbcc-vbk8p\" (UID: \"3d80894e-00e6-4d20-9ce3-398bb8f27e66\") " pod="openshift-authentication/oauth-openshift-7f8484fbcc-vbk8p" Jan 23 10:56:47 crc kubenswrapper[4957]: I0123 10:56:47.556016 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3d80894e-00e6-4d20-9ce3-398bb8f27e66-audit-policies\") pod \"oauth-openshift-7f8484fbcc-vbk8p\" (UID: \"3d80894e-00e6-4d20-9ce3-398bb8f27e66\") " pod="openshift-authentication/oauth-openshift-7f8484fbcc-vbk8p" Jan 23 10:56:47 crc kubenswrapper[4957]: I0123 10:56:47.556787 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3d80894e-00e6-4d20-9ce3-398bb8f27e66-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7f8484fbcc-vbk8p\" (UID: \"3d80894e-00e6-4d20-9ce3-398bb8f27e66\") " pod="openshift-authentication/oauth-openshift-7f8484fbcc-vbk8p" Jan 23 10:56:47 crc kubenswrapper[4957]: I0123 10:56:47.559999 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3d80894e-00e6-4d20-9ce3-398bb8f27e66-v4-0-config-system-session\") pod \"oauth-openshift-7f8484fbcc-vbk8p\" (UID: \"3d80894e-00e6-4d20-9ce3-398bb8f27e66\") " pod="openshift-authentication/oauth-openshift-7f8484fbcc-vbk8p" Jan 23 10:56:47 crc kubenswrapper[4957]: I0123 10:56:47.560108 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/3d80894e-00e6-4d20-9ce3-398bb8f27e66-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7f8484fbcc-vbk8p\" (UID: \"3d80894e-00e6-4d20-9ce3-398bb8f27e66\") " pod="openshift-authentication/oauth-openshift-7f8484fbcc-vbk8p" Jan 23 10:56:47 crc kubenswrapper[4957]: I0123 10:56:47.560136 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/3d80894e-00e6-4d20-9ce3-398bb8f27e66-v4-0-config-user-template-login\") pod \"oauth-openshift-7f8484fbcc-vbk8p\" (UID: \"3d80894e-00e6-4d20-9ce3-398bb8f27e66\") " pod="openshift-authentication/oauth-openshift-7f8484fbcc-vbk8p" Jan 23 10:56:47 crc kubenswrapper[4957]: I0123 10:56:47.560835 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/3d80894e-00e6-4d20-9ce3-398bb8f27e66-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7f8484fbcc-vbk8p\" (UID: \"3d80894e-00e6-4d20-9ce3-398bb8f27e66\") " pod="openshift-authentication/oauth-openshift-7f8484fbcc-vbk8p" Jan 23 10:56:47 crc kubenswrapper[4957]: I0123 10:56:47.566891 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3d80894e-00e6-4d20-9ce3-398bb8f27e66-v4-0-config-system-router-certs\") pod \"oauth-openshift-7f8484fbcc-vbk8p\" (UID: \"3d80894e-00e6-4d20-9ce3-398bb8f27e66\") " pod="openshift-authentication/oauth-openshift-7f8484fbcc-vbk8p" Jan 23 10:56:47 crc kubenswrapper[4957]: I0123 10:56:47.590519 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/3d80894e-00e6-4d20-9ce3-398bb8f27e66-v4-0-config-user-template-error\") pod \"oauth-openshift-7f8484fbcc-vbk8p\" (UID: \"3d80894e-00e6-4d20-9ce3-398bb8f27e66\") " pod="openshift-authentication/oauth-openshift-7f8484fbcc-vbk8p" Jan 23 10:56:47 crc kubenswrapper[4957]: I0123 10:56:47.590615 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/3d80894e-00e6-4d20-9ce3-398bb8f27e66-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7f8484fbcc-vbk8p\" (UID: \"3d80894e-00e6-4d20-9ce3-398bb8f27e66\") " pod="openshift-authentication/oauth-openshift-7f8484fbcc-vbk8p" Jan 23 10:56:47 crc kubenswrapper[4957]: I0123 10:56:47.590657 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Jan 23 10:56:47 crc kubenswrapper[4957]: I0123 10:56:47.591055 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/3d80894e-00e6-4d20-9ce3-398bb8f27e66-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7f8484fbcc-vbk8p\" (UID: \"3d80894e-00e6-4d20-9ce3-398bb8f27e66\") " pod="openshift-authentication/oauth-openshift-7f8484fbcc-vbk8p" Jan 23 10:56:47 crc kubenswrapper[4957]: I0123 10:56:47.598501 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dqqw\" (UniqueName: \"kubernetes.io/projected/3d80894e-00e6-4d20-9ce3-398bb8f27e66-kube-api-access-5dqqw\") pod \"oauth-openshift-7f8484fbcc-vbk8p\" (UID: \"3d80894e-00e6-4d20-9ce3-398bb8f27e66\") " pod="openshift-authentication/oauth-openshift-7f8484fbcc-vbk8p" Jan 23 10:56:47 crc kubenswrapper[4957]: I0123 
10:56:47.600323 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Jan 23 10:56:47 crc kubenswrapper[4957]: I0123 10:56:47.733247 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Jan 23 10:56:47 crc kubenswrapper[4957]: I0123 10:56:47.752071 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-7f8484fbcc-vbk8p" Jan 23 10:56:47 crc kubenswrapper[4957]: I0123 10:56:47.785586 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 23 10:56:47 crc kubenswrapper[4957]: I0123 10:56:47.785898 4957 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 23 10:56:47 crc kubenswrapper[4957]: I0123 10:56:47.791378 4957 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 23 10:56:47 crc kubenswrapper[4957]: I0123 10:56:47.810196 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Jan 23 10:56:47 crc kubenswrapper[4957]: I0123 10:56:47.863239 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Jan 23 10:56:47 crc kubenswrapper[4957]: I0123 10:56:47.897860 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Jan 23 10:56:47 crc kubenswrapper[4957]: I0123 10:56:47.957820 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 23 10:56:48 crc kubenswrapper[4957]: I0123 10:56:48.313184 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Jan 23 10:56:48 crc kubenswrapper[4957]: I0123 10:56:48.414579 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Jan 23 10:56:48 crc kubenswrapper[4957]: I0123 10:56:48.645075 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Jan 23 10:56:48 crc kubenswrapper[4957]: I0123 10:56:48.668952 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Jan 23 10:56:48 crc kubenswrapper[4957]: I0123 10:56:48.709496 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Jan 23 10:56:48 crc kubenswrapper[4957]: I0123 10:56:48.775117 4957 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d" path="/var/lib/kubelet/pods/dbd31c27-d2a4-42b2-83c2-b8f410fe6c1d/volumes" Jan 23 10:56:48 crc kubenswrapper[4957]: I0123 10:56:48.780084 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 23 10:56:48 crc kubenswrapper[4957]: I0123 10:56:48.845700 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 23 10:56:48 crc kubenswrapper[4957]: I0123 10:56:48.867235 4957 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Jan 23 10:56:49 crc kubenswrapper[4957]: I0123 10:56:49.088384 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Jan 23 10:56:49 crc kubenswrapper[4957]: I0123 10:56:49.237788 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Jan 23 10:56:49 crc kubenswrapper[4957]: I0123 10:56:49.260619 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Jan 23 10:56:49 crc kubenswrapper[4957]: I0123 10:56:49.279086 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Jan 23 10:56:49 crc kubenswrapper[4957]: I0123 10:56:49.332485 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Jan 23 10:56:49 crc kubenswrapper[4957]: I0123 10:56:49.476023 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Jan 23 10:56:49 crc kubenswrapper[4957]: I0123 10:56:49.597730 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Jan 23 10:56:50 crc kubenswrapper[4957]: I0123 10:56:50.034931 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Jan 23 10:56:50 crc kubenswrapper[4957]: I0123 10:56:50.087975 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Jan 23 10:56:50 crc kubenswrapper[4957]: I0123 10:56:50.103265 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Jan 23 10:56:50 crc kubenswrapper[4957]: I0123 10:56:50.356995 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Jan 23 10:56:50 crc kubenswrapper[4957]: I0123 10:56:50.402575 4957 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Jan 23 10:56:50 crc kubenswrapper[4957]: I0123 10:56:50.402814 4957 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://b2d7fd59cbbf3cb5b7c203a5d05b8645d2bb0424a0a049225fd52a21950e9c3b" gracePeriod=5 Jan 23 10:56:50 crc kubenswrapper[4957]: I0123 10:56:50.436459 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Jan 23 10:56:50 crc kubenswrapper[4957]: I0123 10:56:50.706776 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Jan 23 10:56:50 crc kubenswrapper[4957]: I0123 10:56:50.740448 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Jan 23 10:56:50 crc kubenswrapper[4957]: I0123 10:56:50.928853 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Jan 23 10:56:50 crc kubenswrapper[4957]: E0123 10:56:50.999401 4957 log.go:32] "RunPodSandbox from runtime service failed" err=< Jan 23 10:56:50 crc kubenswrapper[4957]: rpc 
error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-7f8484fbcc-vbk8p_openshift-authentication_3d80894e-00e6-4d20-9ce3-398bb8f27e66_0(676e83fd064d69c2a82149f5a6d318a200b9053426ccc3921007833f5ee14c41): error adding pod openshift-authentication_oauth-openshift-7f8484fbcc-vbk8p to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"676e83fd064d69c2a82149f5a6d318a200b9053426ccc3921007833f5ee14c41" Netns:"/var/run/netns/b6f18096-1c35-48e8-8fd1-29f3a2e5f815" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-7f8484fbcc-vbk8p;K8S_POD_INFRA_CONTAINER_ID=676e83fd064d69c2a82149f5a6d318a200b9053426ccc3921007833f5ee14c41;K8S_POD_UID=3d80894e-00e6-4d20-9ce3-398bb8f27e66" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-7f8484fbcc-vbk8p] networking: Multus: [openshift-authentication/oauth-openshift-7f8484fbcc-vbk8p/3d80894e-00e6-4d20-9ce3-398bb8f27e66]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod oauth-openshift-7f8484fbcc-vbk8p in out of cluster comm: pod "oauth-openshift-7f8484fbcc-vbk8p" not found Jan 23 10:56:50 crc kubenswrapper[4957]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Jan 23 10:56:50 crc kubenswrapper[4957]: > Jan 23 10:56:50 crc kubenswrapper[4957]: E0123 10:56:50.999473 4957 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Jan 23 10:56:50 crc kubenswrapper[4957]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-7f8484fbcc-vbk8p_openshift-authentication_3d80894e-00e6-4d20-9ce3-398bb8f27e66_0(676e83fd064d69c2a82149f5a6d318a200b9053426ccc3921007833f5ee14c41): error adding pod openshift-authentication_oauth-openshift-7f8484fbcc-vbk8p to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"676e83fd064d69c2a82149f5a6d318a200b9053426ccc3921007833f5ee14c41" Netns:"/var/run/netns/b6f18096-1c35-48e8-8fd1-29f3a2e5f815" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-7f8484fbcc-vbk8p;K8S_POD_INFRA_CONTAINER_ID=676e83fd064d69c2a82149f5a6d318a200b9053426ccc3921007833f5ee14c41;K8S_POD_UID=3d80894e-00e6-4d20-9ce3-398bb8f27e66" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-7f8484fbcc-vbk8p] networking: Multus: [openshift-authentication/oauth-openshift-7f8484fbcc-vbk8p/3d80894e-00e6-4d20-9ce3-398bb8f27e66]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod oauth-openshift-7f8484fbcc-vbk8p in out of cluster comm: pod "oauth-openshift-7f8484fbcc-vbk8p" not found Jan 23 10:56:50 crc kubenswrapper[4957]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Jan 23 10:56:50 crc kubenswrapper[4957]: > pod="openshift-authentication/oauth-openshift-7f8484fbcc-vbk8p" Jan 23 10:56:50 crc kubenswrapper[4957]: E0123 10:56:50.999496 4957 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Jan 23 10:56:50 crc kubenswrapper[4957]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-7f8484fbcc-vbk8p_openshift-authentication_3d80894e-00e6-4d20-9ce3-398bb8f27e66_0(676e83fd064d69c2a82149f5a6d318a200b9053426ccc3921007833f5ee14c41): error adding pod openshift-authentication_oauth-openshift-7f8484fbcc-vbk8p to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"676e83fd064d69c2a82149f5a6d318a200b9053426ccc3921007833f5ee14c41" Netns:"/var/run/netns/b6f18096-1c35-48e8-8fd1-29f3a2e5f815" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-7f8484fbcc-vbk8p;K8S_POD_INFRA_CONTAINER_ID=676e83fd064d69c2a82149f5a6d318a200b9053426ccc3921007833f5ee14c41;K8S_POD_UID=3d80894e-00e6-4d20-9ce3-398bb8f27e66" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-7f8484fbcc-vbk8p] networking: Multus: [openshift-authentication/oauth-openshift-7f8484fbcc-vbk8p/3d80894e-00e6-4d20-9ce3-398bb8f27e66]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod oauth-openshift-7f8484fbcc-vbk8p in out of cluster comm: pod "oauth-openshift-7f8484fbcc-vbk8p" not found Jan 23 10:56:50 crc kubenswrapper[4957]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Jan 23 10:56:50 crc kubenswrapper[4957]: > pod="openshift-authentication/oauth-openshift-7f8484fbcc-vbk8p" Jan 23 10:56:50 crc kubenswrapper[4957]: E0123 10:56:50.999561 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"oauth-openshift-7f8484fbcc-vbk8p_openshift-authentication(3d80894e-00e6-4d20-9ce3-398bb8f27e66)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"oauth-openshift-7f8484fbcc-vbk8p_openshift-authentication(3d80894e-00e6-4d20-9ce3-398bb8f27e66)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-7f8484fbcc-vbk8p_openshift-authentication_3d80894e-00e6-4d20-9ce3-398bb8f27e66_0(676e83fd064d69c2a82149f5a6d318a200b9053426ccc3921007833f5ee14c41): error adding pod openshift-authentication_oauth-openshift-7f8484fbcc-vbk8p to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"676e83fd064d69c2a82149f5a6d318a200b9053426ccc3921007833f5ee14c41\\\" 
Netns:\\\"/var/run/netns/b6f18096-1c35-48e8-8fd1-29f3a2e5f815\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-7f8484fbcc-vbk8p;K8S_POD_INFRA_CONTAINER_ID=676e83fd064d69c2a82149f5a6d318a200b9053426ccc3921007833f5ee14c41;K8S_POD_UID=3d80894e-00e6-4d20-9ce3-398bb8f27e66\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-7f8484fbcc-vbk8p] networking: Multus: [openshift-authentication/oauth-openshift-7f8484fbcc-vbk8p/3d80894e-00e6-4d20-9ce3-398bb8f27e66]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod oauth-openshift-7f8484fbcc-vbk8p in out of cluster comm: pod \\\"oauth-openshift-7f8484fbcc-vbk8p\\\" not found\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-authentication/oauth-openshift-7f8484fbcc-vbk8p" podUID="3d80894e-00e6-4d20-9ce3-398bb8f27e66" Jan 23 10:56:51 crc kubenswrapper[4957]: I0123 10:56:51.088927 4957 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Jan 23 10:56:51 crc kubenswrapper[4957]: I0123 10:56:51.324356 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Jan 23 10:56:51 crc kubenswrapper[4957]: I0123 10:56:51.441061 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Jan 23 10:56:51 crc kubenswrapper[4957]: I0123 10:56:51.828346 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 23 10:56:51 crc kubenswrapper[4957]: I0123 10:56:51.947147 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Jan 23 10:56:52 crc kubenswrapper[4957]: I0123 10:56:52.180396 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Jan 23 10:56:52 crc kubenswrapper[4957]: I0123 10:56:52.234736 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Jan 23 10:56:52 crc kubenswrapper[4957]: I0123 10:56:52.246537 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Jan 23 10:56:52 crc kubenswrapper[4957]: I0123 10:56:52.409517 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 23 10:56:52 crc kubenswrapper[4957]: I0123 10:56:52.679707 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Jan 23 10:56:52 crc kubenswrapper[4957]: I0123 10:56:52.733764 4957 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Jan 23 10:56:53 crc kubenswrapper[4957]: I0123 10:56:53.257724 4957 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ingress"/"openshift-service-ca.crt" Jan 23 10:56:53 crc kubenswrapper[4957]: I0123 10:56:53.578715 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Jan 23 10:56:53 crc kubenswrapper[4957]: I0123 10:56:53.810695 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Jan 23 10:56:54 crc kubenswrapper[4957]: I0123 10:56:54.264792 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Jan 23 10:56:54 crc kubenswrapper[4957]: I0123 10:56:54.408529 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Jan 23 10:56:54 crc kubenswrapper[4957]: I0123 10:56:54.501136 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Jan 23 10:56:54 crc kubenswrapper[4957]: I0123 10:56:54.596845 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Jan 23 10:56:54 crc kubenswrapper[4957]: I0123 10:56:54.620630 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Jan 23 10:56:54 crc kubenswrapper[4957]: I0123 10:56:54.631584 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Jan 23 10:56:54 crc kubenswrapper[4957]: I0123 10:56:54.779217 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Jan 23 10:56:54 crc kubenswrapper[4957]: I0123 10:56:54.835515 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Jan 23 10:56:55 crc kubenswrapper[4957]: I0123 10:56:55.013258 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Jan 23 10:56:55 crc kubenswrapper[4957]: I0123 10:56:55.019470 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Jan 23 10:56:55 crc kubenswrapper[4957]: I0123 10:56:55.112896 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Jan 23 10:56:55 crc kubenswrapper[4957]: I0123 10:56:55.209917 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Jan 23 10:56:55 crc kubenswrapper[4957]: I0123 10:56:55.349706 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Jan 23 10:56:55 crc kubenswrapper[4957]: I0123 10:56:55.440749 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Jan 23 10:56:55 crc kubenswrapper[4957]: I0123 10:56:55.549634 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Jan 23 10:56:55 crc kubenswrapper[4957]: I0123 10:56:55.708505 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Jan 23 10:56:55 crc kubenswrapper[4957]: I0123 10:56:55.817517 4957 generic.go:334] "Generic (PLEG): container finished" 
podID="564e2b7f-db59-4c5e-bb9d-eaefadf2e1a8" containerID="28be68d35637eb2aa8b22156235862d7e6d0ad730ae9d8cf32dbc9b8c094e22a" exitCode=0 Jan 23 10:56:55 crc kubenswrapper[4957]: I0123 10:56:55.817596 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-pq4t4" event={"ID":"564e2b7f-db59-4c5e-bb9d-eaefadf2e1a8","Type":"ContainerDied","Data":"28be68d35637eb2aa8b22156235862d7e6d0ad730ae9d8cf32dbc9b8c094e22a"} Jan 23 10:56:55 crc kubenswrapper[4957]: I0123 10:56:55.818128 4957 scope.go:117] "RemoveContainer" containerID="28be68d35637eb2aa8b22156235862d7e6d0ad730ae9d8cf32dbc9b8c094e22a" Jan 23 10:56:55 crc kubenswrapper[4957]: I0123 10:56:55.819109 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Jan 23 10:56:55 crc kubenswrapper[4957]: I0123 10:56:55.819159 4957 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="b2d7fd59cbbf3cb5b7c203a5d05b8645d2bb0424a0a049225fd52a21950e9c3b" exitCode=137 Jan 23 10:56:55 crc kubenswrapper[4957]: I0123 10:56:55.969915 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Jan 23 10:56:55 crc kubenswrapper[4957]: I0123 10:56:55.977005 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Jan 23 10:56:55 crc kubenswrapper[4957]: I0123 10:56:55.990101 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Jan 23 10:56:55 crc kubenswrapper[4957]: I0123 10:56:55.990168 4957 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 23 10:56:56 crc kubenswrapper[4957]: I0123 10:56:56.027609 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Jan 23 10:56:56 crc kubenswrapper[4957]: I0123 10:56:56.091269 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 23 10:56:56 crc kubenswrapper[4957]: I0123 10:56:56.091402 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 23 10:56:56 crc kubenswrapper[4957]: I0123 10:56:56.091476 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 23 10:56:56 crc kubenswrapper[4957]: I0123 10:56:56.091499 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 23 10:56:56 crc kubenswrapper[4957]: I0123 10:56:56.091559 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 23 10:56:56 crc kubenswrapper[4957]: I0123 10:56:56.091623 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 23 10:56:56 crc kubenswrapper[4957]: I0123 10:56:56.091649 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 23 10:56:56 crc kubenswrapper[4957]: I0123 10:56:56.091676 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 23 10:56:56 crc kubenswrapper[4957]: I0123 10:56:56.091761 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 23 10:56:56 crc kubenswrapper[4957]: I0123 10:56:56.091943 4957 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Jan 23 10:56:56 crc kubenswrapper[4957]: I0123 10:56:56.091969 4957 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Jan 23 10:56:56 crc kubenswrapper[4957]: I0123 10:56:56.091984 4957 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Jan 23 10:56:56 crc kubenswrapper[4957]: I0123 10:56:56.091994 4957 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Jan 23 10:56:56 crc kubenswrapper[4957]: I0123 10:56:56.101011 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 23 10:56:56 crc kubenswrapper[4957]: I0123 10:56:56.192673 4957 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Jan 23 10:56:56 crc kubenswrapper[4957]: I0123 10:56:56.194676 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Jan 23 10:56:56 crc kubenswrapper[4957]: I0123 10:56:56.380924 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Jan 23 10:56:56 crc kubenswrapper[4957]: I0123 10:56:56.399459 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 23 10:56:56 crc kubenswrapper[4957]: I0123 10:56:56.553688 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Jan 23 10:56:56 crc kubenswrapper[4957]: I0123 10:56:56.576743 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Jan 23 10:56:56 crc kubenswrapper[4957]: I0123 10:56:56.780568 4957 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Jan 23 10:56:56 crc kubenswrapper[4957]: I0123 10:56:56.823027 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Jan 23 10:56:56 crc kubenswrapper[4957]: I0123 10:56:56.825252 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Jan 23 10:56:56 crc kubenswrapper[4957]: I0123 10:56:56.825359 4957 scope.go:117] "RemoveContainer" containerID="b2d7fd59cbbf3cb5b7c203a5d05b8645d2bb0424a0a049225fd52a21950e9c3b" Jan 23 10:56:56 crc kubenswrapper[4957]: I0123 10:56:56.825469 4957 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 23 10:56:56 crc kubenswrapper[4957]: I0123 10:56:56.830749 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-pq4t4" event={"ID":"564e2b7f-db59-4c5e-bb9d-eaefadf2e1a8","Type":"ContainerStarted","Data":"6453ea15c1f33e7ae9aa572cff0d03574dd1544b3241afc24287eee0c839a825"} Jan 23 10:56:56 crc kubenswrapper[4957]: I0123 10:56:56.831155 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-pq4t4" Jan 23 10:56:56 crc kubenswrapper[4957]: I0123 10:56:56.834549 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-pq4t4" Jan 23 10:56:56 crc kubenswrapper[4957]: I0123 10:56:56.858770 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Jan 23 10:56:57 crc kubenswrapper[4957]: I0123 10:56:57.239503 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Jan 23 10:56:57 crc kubenswrapper[4957]: I0123 10:56:57.318498 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Jan 23 10:56:57 crc kubenswrapper[4957]: I0123 10:56:57.357558 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Jan 23 10:56:57 crc kubenswrapper[4957]: I0123 10:56:57.522940 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Jan 23 10:56:57 crc kubenswrapper[4957]: I0123 10:56:57.664083 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Jan 23 10:56:57 crc kubenswrapper[4957]: I0123 10:56:57.770401 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Jan 23 10:56:57 crc kubenswrapper[4957]: I0123 10:56:57.776181 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Jan 23 10:56:57 crc kubenswrapper[4957]: I0123 10:56:57.784649 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 23 10:56:58 crc kubenswrapper[4957]: I0123 10:56:58.078640 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Jan 23 10:56:58 crc kubenswrapper[4957]: I0123 10:56:58.188516 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Jan 23 10:56:58 crc kubenswrapper[4957]: I0123 10:56:58.333121 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Jan 23 10:56:58 crc kubenswrapper[4957]: I0123 10:56:58.502060 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Jan 23 10:56:59 crc kubenswrapper[4957]: I0123 10:56:59.066029 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Jan 23 10:56:59 crc kubenswrapper[4957]: I0123 10:56:59.472774 4957 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-controller-manager/controller-manager-f6857c485-zm57k"] Jan 23 10:56:59 crc kubenswrapper[4957]: I0123 10:56:59.473400 4957 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-f6857c485-zm57k" podUID="1b54be7c-8091-4cb4-bca2-8bb0691cd1ab" containerName="controller-manager" containerID="cri-o://a932ded0eec0316dbcf394a6add196e5056c48b5d2a44b51bb89191816ed38b0" gracePeriod=30 Jan 23 10:56:59 crc kubenswrapper[4957]: I0123 10:56:59.571889 4957 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-c54fd9bc-zh7lp"] Jan 23 10:56:59 crc kubenswrapper[4957]: I0123 10:56:59.572101 4957 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-c54fd9bc-zh7lp" podUID="336f11b9-7c5e-47c9-86cd-8fffc5f6caf1" containerName="route-controller-manager" containerID="cri-o://78f9c881642cb9ebe0ccdd114954d397ed6bb18b0060ac736c2c0224fbb92f60" gracePeriod=30 Jan 23 10:56:59 crc kubenswrapper[4957]: I0123 10:56:59.612791 4957 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Jan 23 10:56:59 crc kubenswrapper[4957]: I0123 10:56:59.729476 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Jan 23 10:56:59 crc kubenswrapper[4957]: I0123 10:56:59.778905 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Jan 23 10:56:59 crc kubenswrapper[4957]: I0123 10:56:59.848590 4957 generic.go:334] "Generic (PLEG): container finished" podID="336f11b9-7c5e-47c9-86cd-8fffc5f6caf1" containerID="78f9c881642cb9ebe0ccdd114954d397ed6bb18b0060ac736c2c0224fbb92f60" exitCode=0 Jan 23 10:56:59 crc kubenswrapper[4957]: I0123 10:56:59.848690 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-c54fd9bc-zh7lp" event={"ID":"336f11b9-7c5e-47c9-86cd-8fffc5f6caf1","Type":"ContainerDied","Data":"78f9c881642cb9ebe0ccdd114954d397ed6bb18b0060ac736c2c0224fbb92f60"} Jan 23 10:56:59 crc kubenswrapper[4957]: I0123 10:56:59.850814 4957 generic.go:334] "Generic (PLEG): container finished" podID="1b54be7c-8091-4cb4-bca2-8bb0691cd1ab" containerID="a932ded0eec0316dbcf394a6add196e5056c48b5d2a44b51bb89191816ed38b0" exitCode=0 Jan 23 10:56:59 crc kubenswrapper[4957]: I0123 10:56:59.850870 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-f6857c485-zm57k" event={"ID":"1b54be7c-8091-4cb4-bca2-8bb0691cd1ab","Type":"ContainerDied","Data":"a932ded0eec0316dbcf394a6add196e5056c48b5d2a44b51bb89191816ed38b0"} Jan 23 10:56:59 crc kubenswrapper[4957]: I0123 10:56:59.850905 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-f6857c485-zm57k" event={"ID":"1b54be7c-8091-4cb4-bca2-8bb0691cd1ab","Type":"ContainerDied","Data":"096f3ecb1a1f9e6b3a5fc96d43dbd0a2ec9925c8d68c1e055fb59cc16720974c"} Jan 23 10:56:59 crc kubenswrapper[4957]: I0123 10:56:59.850919 4957 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="096f3ecb1a1f9e6b3a5fc96d43dbd0a2ec9925c8d68c1e055fb59cc16720974c" Jan 23 10:56:59 crc kubenswrapper[4957]: I0123 10:56:59.873244 4957 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-f6857c485-zm57k" Jan 23 10:56:59 crc kubenswrapper[4957]: I0123 10:56:59.949603 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6wqdw\" (UniqueName: \"kubernetes.io/projected/1b54be7c-8091-4cb4-bca2-8bb0691cd1ab-kube-api-access-6wqdw\") pod \"1b54be7c-8091-4cb4-bca2-8bb0691cd1ab\" (UID: \"1b54be7c-8091-4cb4-bca2-8bb0691cd1ab\") " Jan 23 10:56:59 crc kubenswrapper[4957]: I0123 10:56:59.949647 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1b54be7c-8091-4cb4-bca2-8bb0691cd1ab-serving-cert\") pod \"1b54be7c-8091-4cb4-bca2-8bb0691cd1ab\" (UID: \"1b54be7c-8091-4cb4-bca2-8bb0691cd1ab\") " Jan 23 10:56:59 crc kubenswrapper[4957]: I0123 10:56:59.949694 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1b54be7c-8091-4cb4-bca2-8bb0691cd1ab-proxy-ca-bundles\") pod \"1b54be7c-8091-4cb4-bca2-8bb0691cd1ab\" (UID: \"1b54be7c-8091-4cb4-bca2-8bb0691cd1ab\") " Jan 23 10:56:59 crc kubenswrapper[4957]: I0123 10:56:59.949766 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b54be7c-8091-4cb4-bca2-8bb0691cd1ab-config\") pod \"1b54be7c-8091-4cb4-bca2-8bb0691cd1ab\" (UID: \"1b54be7c-8091-4cb4-bca2-8bb0691cd1ab\") " Jan 23 10:56:59 crc kubenswrapper[4957]: I0123 10:56:59.949795 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1b54be7c-8091-4cb4-bca2-8bb0691cd1ab-client-ca\") pod \"1b54be7c-8091-4cb4-bca2-8bb0691cd1ab\" (UID: \"1b54be7c-8091-4cb4-bca2-8bb0691cd1ab\") " Jan 23 10:56:59 crc kubenswrapper[4957]: I0123 10:56:59.951055 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b54be7c-8091-4cb4-bca2-8bb0691cd1ab-client-ca" (OuterVolumeSpecName: "client-ca") pod "1b54be7c-8091-4cb4-bca2-8bb0691cd1ab" (UID: "1b54be7c-8091-4cb4-bca2-8bb0691cd1ab"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 10:56:59 crc kubenswrapper[4957]: I0123 10:56:59.952084 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b54be7c-8091-4cb4-bca2-8bb0691cd1ab-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "1b54be7c-8091-4cb4-bca2-8bb0691cd1ab" (UID: "1b54be7c-8091-4cb4-bca2-8bb0691cd1ab"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 10:56:59 crc kubenswrapper[4957]: I0123 10:56:59.952861 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b54be7c-8091-4cb4-bca2-8bb0691cd1ab-config" (OuterVolumeSpecName: "config") pod "1b54be7c-8091-4cb4-bca2-8bb0691cd1ab" (UID: "1b54be7c-8091-4cb4-bca2-8bb0691cd1ab"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 10:56:59 crc kubenswrapper[4957]: I0123 10:56:59.971524 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b54be7c-8091-4cb4-bca2-8bb0691cd1ab-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1b54be7c-8091-4cb4-bca2-8bb0691cd1ab" (UID: "1b54be7c-8091-4cb4-bca2-8bb0691cd1ab"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 23 10:56:59 crc kubenswrapper[4957]: I0123 10:56:59.971813 4957 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-c54fd9bc-zh7lp" Jan 23 10:56:59 crc kubenswrapper[4957]: I0123 10:56:59.980033 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b54be7c-8091-4cb4-bca2-8bb0691cd1ab-kube-api-access-6wqdw" (OuterVolumeSpecName: "kube-api-access-6wqdw") pod "1b54be7c-8091-4cb4-bca2-8bb0691cd1ab" (UID: "1b54be7c-8091-4cb4-bca2-8bb0691cd1ab"). InnerVolumeSpecName "kube-api-access-6wqdw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 23 10:56:59 crc kubenswrapper[4957]: I0123 10:56:59.980219 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 23 10:57:00 crc kubenswrapper[4957]: I0123 10:57:00.031926 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Jan 23 10:57:00 crc kubenswrapper[4957]: I0123 10:57:00.050624 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/336f11b9-7c5e-47c9-86cd-8fffc5f6caf1-config\") pod \"336f11b9-7c5e-47c9-86cd-8fffc5f6caf1\" (UID: \"336f11b9-7c5e-47c9-86cd-8fffc5f6caf1\") " Jan 23 10:57:00 crc kubenswrapper[4957]: I0123 10:57:00.050690 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/336f11b9-7c5e-47c9-86cd-8fffc5f6caf1-serving-cert\") pod \"336f11b9-7c5e-47c9-86cd-8fffc5f6caf1\" (UID: \"336f11b9-7c5e-47c9-86cd-8fffc5f6caf1\") " Jan 23 10:57:00 crc kubenswrapper[4957]: I0123 10:57:00.050713 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/336f11b9-7c5e-47c9-86cd-8fffc5f6caf1-client-ca\") pod \"336f11b9-7c5e-47c9-86cd-8fffc5f6caf1\" (UID: \"336f11b9-7c5e-47c9-86cd-8fffc5f6caf1\") " Jan 23 10:57:00 crc kubenswrapper[4957]: I0123 10:57:00.050791 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hht6r\" (UniqueName: \"kubernetes.io/projected/336f11b9-7c5e-47c9-86cd-8fffc5f6caf1-kube-api-access-hht6r\") pod \"336f11b9-7c5e-47c9-86cd-8fffc5f6caf1\" (UID: \"336f11b9-7c5e-47c9-86cd-8fffc5f6caf1\") " Jan 23 10:57:00 crc kubenswrapper[4957]: I0123 10:57:00.051002 4957 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6wqdw\" (UniqueName: \"kubernetes.io/projected/1b54be7c-8091-4cb4-bca2-8bb0691cd1ab-kube-api-access-6wqdw\") on node \"crc\" DevicePath \"\"" Jan 23 10:57:00 crc kubenswrapper[4957]: I0123 10:57:00.051017 4957 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1b54be7c-8091-4cb4-bca2-8bb0691cd1ab-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 23 10:57:00 crc kubenswrapper[4957]: I0123 10:57:00.051025 4957 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1b54be7c-8091-4cb4-bca2-8bb0691cd1ab-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 23 10:57:00 crc kubenswrapper[4957]: I0123 10:57:00.051034 4957 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/1b54be7c-8091-4cb4-bca2-8bb0691cd1ab-config\") on node \"crc\" DevicePath \"\"" Jan 23 10:57:00 crc kubenswrapper[4957]: I0123 10:57:00.051042 4957 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1b54be7c-8091-4cb4-bca2-8bb0691cd1ab-client-ca\") on node \"crc\" DevicePath \"\"" Jan 23 10:57:00 crc kubenswrapper[4957]: I0123 10:57:00.051581 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/336f11b9-7c5e-47c9-86cd-8fffc5f6caf1-client-ca" (OuterVolumeSpecName: "client-ca") pod "336f11b9-7c5e-47c9-86cd-8fffc5f6caf1" (UID: "336f11b9-7c5e-47c9-86cd-8fffc5f6caf1"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 10:57:00 crc kubenswrapper[4957]: I0123 10:57:00.051598 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/336f11b9-7c5e-47c9-86cd-8fffc5f6caf1-config" (OuterVolumeSpecName: "config") pod "336f11b9-7c5e-47c9-86cd-8fffc5f6caf1" (UID: "336f11b9-7c5e-47c9-86cd-8fffc5f6caf1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 10:57:00 crc kubenswrapper[4957]: I0123 10:57:00.053767 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/336f11b9-7c5e-47c9-86cd-8fffc5f6caf1-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "336f11b9-7c5e-47c9-86cd-8fffc5f6caf1" (UID: "336f11b9-7c5e-47c9-86cd-8fffc5f6caf1"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 23 10:57:00 crc kubenswrapper[4957]: I0123 10:57:00.054444 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/336f11b9-7c5e-47c9-86cd-8fffc5f6caf1-kube-api-access-hht6r" (OuterVolumeSpecName: "kube-api-access-hht6r") pod "336f11b9-7c5e-47c9-86cd-8fffc5f6caf1" (UID: "336f11b9-7c5e-47c9-86cd-8fffc5f6caf1"). InnerVolumeSpecName "kube-api-access-hht6r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 23 10:57:00 crc kubenswrapper[4957]: I0123 10:57:00.124333 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Jan 23 10:57:00 crc kubenswrapper[4957]: I0123 10:57:00.151882 4957 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/336f11b9-7c5e-47c9-86cd-8fffc5f6caf1-config\") on node \"crc\" DevicePath \"\"" Jan 23 10:57:00 crc kubenswrapper[4957]: I0123 10:57:00.151915 4957 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/336f11b9-7c5e-47c9-86cd-8fffc5f6caf1-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 23 10:57:00 crc kubenswrapper[4957]: I0123 10:57:00.151924 4957 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/336f11b9-7c5e-47c9-86cd-8fffc5f6caf1-client-ca\") on node \"crc\" DevicePath \"\"" Jan 23 10:57:00 crc kubenswrapper[4957]: I0123 10:57:00.151935 4957 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hht6r\" (UniqueName: \"kubernetes.io/projected/336f11b9-7c5e-47c9-86cd-8fffc5f6caf1-kube-api-access-hht6r\") on node \"crc\" DevicePath \"\"" Jan 23 10:57:00 crc kubenswrapper[4957]: I0123 10:57:00.395491 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Jan 23 10:57:00 crc kubenswrapper[4957]: I0123 10:57:00.650084 4957 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-b99f7d5cb-brfvx"] Jan 23 10:57:00 crc kubenswrapper[4957]: E0123 10:57:00.650512 4957 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Jan 23 10:57:00 crc kubenswrapper[4957]: I0123 10:57:00.650539 4957 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Jan 23 10:57:00 crc kubenswrapper[4957]: E0123 10:57:00.650594 4957 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="336f11b9-7c5e-47c9-86cd-8fffc5f6caf1" containerName="route-controller-manager" Jan 23 10:57:00 crc kubenswrapper[4957]: I0123 10:57:00.650611 4957 state_mem.go:107] "Deleted CPUSet assignment" podUID="336f11b9-7c5e-47c9-86cd-8fffc5f6caf1" containerName="route-controller-manager" Jan 23 10:57:00 crc kubenswrapper[4957]: E0123 10:57:00.650632 4957 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b54be7c-8091-4cb4-bca2-8bb0691cd1ab" containerName="controller-manager" Jan 23 10:57:00 crc kubenswrapper[4957]: I0123 10:57:00.650650 4957 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b54be7c-8091-4cb4-bca2-8bb0691cd1ab" containerName="controller-manager" Jan 23 10:57:00 crc kubenswrapper[4957]: I0123 10:57:00.650856 4957 memory_manager.go:354] "RemoveStaleState removing state" podUID="336f11b9-7c5e-47c9-86cd-8fffc5f6caf1" containerName="route-controller-manager" Jan 23 10:57:00 crc kubenswrapper[4957]: I0123 10:57:00.650886 4957 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b54be7c-8091-4cb4-bca2-8bb0691cd1ab" containerName="controller-manager" Jan 23 10:57:00 crc kubenswrapper[4957]: I0123 10:57:00.650910 4957 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Jan 23 10:57:00 crc kubenswrapper[4957]: I0123 10:57:00.651741 4957 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-b99f7d5cb-brfvx" Jan 23 10:57:00 crc kubenswrapper[4957]: I0123 10:57:00.653879 4957 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c857f95bd-qfvdg"] Jan 23 10:57:00 crc kubenswrapper[4957]: I0123 10:57:00.654843 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c857f95bd-qfvdg" Jan 23 10:57:00 crc kubenswrapper[4957]: I0123 10:57:00.669382 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-b99f7d5cb-brfvx"] Jan 23 10:57:00 crc kubenswrapper[4957]: I0123 10:57:00.677449 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c857f95bd-qfvdg"] Jan 23 10:57:00 crc kubenswrapper[4957]: I0123 10:57:00.757758 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7b4feff5-e35c-4518-a9b8-4e39d86c8d56-client-ca\") pod \"controller-manager-b99f7d5cb-brfvx\" (UID: \"7b4feff5-e35c-4518-a9b8-4e39d86c8d56\") " pod="openshift-controller-manager/controller-manager-b99f7d5cb-brfvx" Jan 23 10:57:00 crc kubenswrapper[4957]: I0123 10:57:00.758160 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e52d56b3-bdc9-4c7e-8ebb-35302e811647-serving-cert\") pod \"route-controller-manager-5c857f95bd-qfvdg\" (UID: \"e52d56b3-bdc9-4c7e-8ebb-35302e811647\") " pod="openshift-route-controller-manager/route-controller-manager-5c857f95bd-qfvdg" Jan 23 10:57:00 crc kubenswrapper[4957]: I0123 10:57:00.758197 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b4feff5-e35c-4518-a9b8-4e39d86c8d56-config\") pod \"controller-manager-b99f7d5cb-brfvx\" (UID: \"7b4feff5-e35c-4518-a9b8-4e39d86c8d56\") " pod="openshift-controller-manager/controller-manager-b99f7d5cb-brfvx" Jan 23 10:57:00 crc kubenswrapper[4957]: I0123 10:57:00.758217 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7b4feff5-e35c-4518-a9b8-4e39d86c8d56-serving-cert\") pod \"controller-manager-b99f7d5cb-brfvx\" (UID: \"7b4feff5-e35c-4518-a9b8-4e39d86c8d56\") " pod="openshift-controller-manager/controller-manager-b99f7d5cb-brfvx" Jan 23 10:57:00 crc kubenswrapper[4957]: I0123 10:57:00.758240 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e52d56b3-bdc9-4c7e-8ebb-35302e811647-config\") pod \"route-controller-manager-5c857f95bd-qfvdg\" (UID: \"e52d56b3-bdc9-4c7e-8ebb-35302e811647\") " pod="openshift-route-controller-manager/route-controller-manager-5c857f95bd-qfvdg" Jan 23 10:57:00 crc kubenswrapper[4957]: I0123 10:57:00.758256 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7b4feff5-e35c-4518-a9b8-4e39d86c8d56-proxy-ca-bundles\") pod \"controller-manager-b99f7d5cb-brfvx\" (UID: \"7b4feff5-e35c-4518-a9b8-4e39d86c8d56\") " 
pod="openshift-controller-manager/controller-manager-b99f7d5cb-brfvx" Jan 23 10:57:00 crc kubenswrapper[4957]: I0123 10:57:00.758306 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78msd\" (UniqueName: \"kubernetes.io/projected/7b4feff5-e35c-4518-a9b8-4e39d86c8d56-kube-api-access-78msd\") pod \"controller-manager-b99f7d5cb-brfvx\" (UID: \"7b4feff5-e35c-4518-a9b8-4e39d86c8d56\") " pod="openshift-controller-manager/controller-manager-b99f7d5cb-brfvx" Jan 23 10:57:00 crc kubenswrapper[4957]: I0123 10:57:00.758331 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e52d56b3-bdc9-4c7e-8ebb-35302e811647-client-ca\") pod \"route-controller-manager-5c857f95bd-qfvdg\" (UID: \"e52d56b3-bdc9-4c7e-8ebb-35302e811647\") " pod="openshift-route-controller-manager/route-controller-manager-5c857f95bd-qfvdg" Jan 23 10:57:00 crc kubenswrapper[4957]: I0123 10:57:00.758544 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmvqc\" (UniqueName: \"kubernetes.io/projected/e52d56b3-bdc9-4c7e-8ebb-35302e811647-kube-api-access-fmvqc\") pod \"route-controller-manager-5c857f95bd-qfvdg\" (UID: \"e52d56b3-bdc9-4c7e-8ebb-35302e811647\") " pod="openshift-route-controller-manager/route-controller-manager-5c857f95bd-qfvdg" Jan 23 10:57:00 crc kubenswrapper[4957]: I0123 10:57:00.765641 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Jan 23 10:57:00 crc kubenswrapper[4957]: I0123 10:57:00.828934 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Jan 23 10:57:00 crc kubenswrapper[4957]: I0123 10:57:00.858446 4957 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-f6857c485-zm57k" Jan 23 10:57:00 crc kubenswrapper[4957]: I0123 10:57:00.858468 4957 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-c54fd9bc-zh7lp" Jan 23 10:57:00 crc kubenswrapper[4957]: I0123 10:57:00.858463 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-c54fd9bc-zh7lp" event={"ID":"336f11b9-7c5e-47c9-86cd-8fffc5f6caf1","Type":"ContainerDied","Data":"6d49c59e16cddd823b84aaa6fdad76811e85076757cb2d1212c22c54a396d132"} Jan 23 10:57:00 crc kubenswrapper[4957]: I0123 10:57:00.858561 4957 scope.go:117] "RemoveContainer" containerID="78f9c881642cb9ebe0ccdd114954d397ed6bb18b0060ac736c2c0224fbb92f60" Jan 23 10:57:00 crc kubenswrapper[4957]: I0123 10:57:00.860541 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7b4feff5-e35c-4518-a9b8-4e39d86c8d56-client-ca\") pod \"controller-manager-b99f7d5cb-brfvx\" (UID: \"7b4feff5-e35c-4518-a9b8-4e39d86c8d56\") " pod="openshift-controller-manager/controller-manager-b99f7d5cb-brfvx" Jan 23 10:57:00 crc kubenswrapper[4957]: I0123 10:57:00.860595 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e52d56b3-bdc9-4c7e-8ebb-35302e811647-serving-cert\") pod \"route-controller-manager-5c857f95bd-qfvdg\" (UID: \"e52d56b3-bdc9-4c7e-8ebb-35302e811647\") " pod="openshift-route-controller-manager/route-controller-manager-5c857f95bd-qfvdg" Jan 23 10:57:00 crc kubenswrapper[4957]: I0123 10:57:00.860668 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b4feff5-e35c-4518-a9b8-4e39d86c8d56-config\") pod \"controller-manager-b99f7d5cb-brfvx\" (UID: \"7b4feff5-e35c-4518-a9b8-4e39d86c8d56\") " pod="openshift-controller-manager/controller-manager-b99f7d5cb-brfvx" Jan 23 10:57:00 crc kubenswrapper[4957]: I0123 10:57:00.860703 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7b4feff5-e35c-4518-a9b8-4e39d86c8d56-serving-cert\") pod \"controller-manager-b99f7d5cb-brfvx\" (UID: \"7b4feff5-e35c-4518-a9b8-4e39d86c8d56\") " pod="openshift-controller-manager/controller-manager-b99f7d5cb-brfvx" Jan 23 10:57:00 crc kubenswrapper[4957]: I0123 10:57:00.860729 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e52d56b3-bdc9-4c7e-8ebb-35302e811647-config\") pod \"route-controller-manager-5c857f95bd-qfvdg\" (UID: \"e52d56b3-bdc9-4c7e-8ebb-35302e811647\") " pod="openshift-route-controller-manager/route-controller-manager-5c857f95bd-qfvdg" Jan 23 10:57:00 crc kubenswrapper[4957]: I0123 10:57:00.860751 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7b4feff5-e35c-4518-a9b8-4e39d86c8d56-proxy-ca-bundles\") pod \"controller-manager-b99f7d5cb-brfvx\" (UID: \"7b4feff5-e35c-4518-a9b8-4e39d86c8d56\") " pod="openshift-controller-manager/controller-manager-b99f7d5cb-brfvx" Jan 23 10:57:00 crc kubenswrapper[4957]: I0123 10:57:00.860805 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78msd\" (UniqueName: \"kubernetes.io/projected/7b4feff5-e35c-4518-a9b8-4e39d86c8d56-kube-api-access-78msd\") pod \"controller-manager-b99f7d5cb-brfvx\" (UID: \"7b4feff5-e35c-4518-a9b8-4e39d86c8d56\") " 
pod="openshift-controller-manager/controller-manager-b99f7d5cb-brfvx" Jan 23 10:57:00 crc kubenswrapper[4957]: I0123 10:57:00.861738 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e52d56b3-bdc9-4c7e-8ebb-35302e811647-client-ca\") pod \"route-controller-manager-5c857f95bd-qfvdg\" (UID: \"e52d56b3-bdc9-4c7e-8ebb-35302e811647\") " pod="openshift-route-controller-manager/route-controller-manager-5c857f95bd-qfvdg" Jan 23 10:57:00 crc kubenswrapper[4957]: I0123 10:57:00.861837 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmvqc\" (UniqueName: \"kubernetes.io/projected/e52d56b3-bdc9-4c7e-8ebb-35302e811647-kube-api-access-fmvqc\") pod \"route-controller-manager-5c857f95bd-qfvdg\" (UID: \"e52d56b3-bdc9-4c7e-8ebb-35302e811647\") " pod="openshift-route-controller-manager/route-controller-manager-5c857f95bd-qfvdg" Jan 23 10:57:00 crc kubenswrapper[4957]: I0123 10:57:00.861661 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7b4feff5-e35c-4518-a9b8-4e39d86c8d56-client-ca\") pod \"controller-manager-b99f7d5cb-brfvx\" (UID: \"7b4feff5-e35c-4518-a9b8-4e39d86c8d56\") " pod="openshift-controller-manager/controller-manager-b99f7d5cb-brfvx" Jan 23 10:57:00 crc kubenswrapper[4957]: I0123 10:57:00.862699 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e52d56b3-bdc9-4c7e-8ebb-35302e811647-client-ca\") pod \"route-controller-manager-5c857f95bd-qfvdg\" (UID: \"e52d56b3-bdc9-4c7e-8ebb-35302e811647\") " pod="openshift-route-controller-manager/route-controller-manager-5c857f95bd-qfvdg" Jan 23 10:57:00 crc kubenswrapper[4957]: I0123 10:57:00.863006 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e52d56b3-bdc9-4c7e-8ebb-35302e811647-config\") pod \"route-controller-manager-5c857f95bd-qfvdg\" (UID: \"e52d56b3-bdc9-4c7e-8ebb-35302e811647\") " pod="openshift-route-controller-manager/route-controller-manager-5c857f95bd-qfvdg" Jan 23 10:57:00 crc kubenswrapper[4957]: I0123 10:57:00.863465 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7b4feff5-e35c-4518-a9b8-4e39d86c8d56-proxy-ca-bundles\") pod \"controller-manager-b99f7d5cb-brfvx\" (UID: \"7b4feff5-e35c-4518-a9b8-4e39d86c8d56\") " pod="openshift-controller-manager/controller-manager-b99f7d5cb-brfvx" Jan 23 10:57:00 crc kubenswrapper[4957]: I0123 10:57:00.864071 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b4feff5-e35c-4518-a9b8-4e39d86c8d56-config\") pod \"controller-manager-b99f7d5cb-brfvx\" (UID: \"7b4feff5-e35c-4518-a9b8-4e39d86c8d56\") " pod="openshift-controller-manager/controller-manager-b99f7d5cb-brfvx" Jan 23 10:57:00 crc kubenswrapper[4957]: I0123 10:57:00.868242 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e52d56b3-bdc9-4c7e-8ebb-35302e811647-serving-cert\") pod \"route-controller-manager-5c857f95bd-qfvdg\" (UID: \"e52d56b3-bdc9-4c7e-8ebb-35302e811647\") " pod="openshift-route-controller-manager/route-controller-manager-5c857f95bd-qfvdg" Jan 23 10:57:00 crc kubenswrapper[4957]: I0123 10:57:00.870362 4957 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7b4feff5-e35c-4518-a9b8-4e39d86c8d56-serving-cert\") pod \"controller-manager-b99f7d5cb-brfvx\" (UID: \"7b4feff5-e35c-4518-a9b8-4e39d86c8d56\") " pod="openshift-controller-manager/controller-manager-b99f7d5cb-brfvx" Jan 23 10:57:00 crc kubenswrapper[4957]: I0123 10:57:00.882049 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmvqc\" (UniqueName: \"kubernetes.io/projected/e52d56b3-bdc9-4c7e-8ebb-35302e811647-kube-api-access-fmvqc\") pod \"route-controller-manager-5c857f95bd-qfvdg\" (UID: \"e52d56b3-bdc9-4c7e-8ebb-35302e811647\") " pod="openshift-route-controller-manager/route-controller-manager-5c857f95bd-qfvdg" Jan 23 10:57:00 crc kubenswrapper[4957]: I0123 10:57:00.883173 4957 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-f6857c485-zm57k"] Jan 23 10:57:00 crc kubenswrapper[4957]: I0123 10:57:00.884696 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78msd\" (UniqueName: \"kubernetes.io/projected/7b4feff5-e35c-4518-a9b8-4e39d86c8d56-kube-api-access-78msd\") pod \"controller-manager-b99f7d5cb-brfvx\" (UID: \"7b4feff5-e35c-4518-a9b8-4e39d86c8d56\") " pod="openshift-controller-manager/controller-manager-b99f7d5cb-brfvx" Jan 23 10:57:00 crc kubenswrapper[4957]: I0123 10:57:00.887452 4957 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-f6857c485-zm57k"] Jan 23 10:57:00 crc kubenswrapper[4957]: I0123 10:57:00.900765 4957 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-c54fd9bc-zh7lp"] Jan 23 10:57:00 crc kubenswrapper[4957]: I0123 10:57:00.907676 4957 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-c54fd9bc-zh7lp"] Jan 23 10:57:00 crc kubenswrapper[4957]: I0123 10:57:00.985359 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-b99f7d5cb-brfvx" Jan 23 10:57:01 crc kubenswrapper[4957]: I0123 10:57:01.002055 4957 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c857f95bd-qfvdg" Jan 23 10:57:01 crc kubenswrapper[4957]: I0123 10:57:01.041707 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Jan 23 10:57:01 crc kubenswrapper[4957]: I0123 10:57:01.042902 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Jan 23 10:57:01 crc kubenswrapper[4957]: I0123 10:57:01.063685 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Jan 23 10:57:01 crc kubenswrapper[4957]: I0123 10:57:01.182756 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-b99f7d5cb-brfvx"] Jan 23 10:57:01 crc kubenswrapper[4957]: I0123 10:57:01.237114 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c857f95bd-qfvdg"] Jan 23 10:57:01 crc kubenswrapper[4957]: W0123 10:57:01.241616 4957 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode52d56b3_bdc9_4c7e_8ebb_35302e811647.slice/crio-558678c105d84d7e483631c84f83334e85d3556c45f5d7a3e464d6ce8083f4bb WatchSource:0}: Error finding container 558678c105d84d7e483631c84f83334e85d3556c45f5d7a3e464d6ce8083f4bb: Status 404 returned error can't find the container with id 558678c105d84d7e483631c84f83334e85d3556c45f5d7a3e464d6ce8083f4bb Jan 23 10:57:01 crc kubenswrapper[4957]: I0123 10:57:01.519006 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Jan 23 10:57:01 crc kubenswrapper[4957]: I0123 10:57:01.565761 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Jan 23 10:57:01 crc kubenswrapper[4957]: I0123 10:57:01.768799 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-7f8484fbcc-vbk8p" Jan 23 10:57:01 crc kubenswrapper[4957]: I0123 10:57:01.769246 4957 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-7f8484fbcc-vbk8p" Jan 23 10:57:01 crc kubenswrapper[4957]: I0123 10:57:01.797736 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Jan 23 10:57:01 crc kubenswrapper[4957]: I0123 10:57:01.866328 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5c857f95bd-qfvdg" event={"ID":"e52d56b3-bdc9-4c7e-8ebb-35302e811647","Type":"ContainerStarted","Data":"2d41a0569bbae0bd2046bc3f8bb9659cbb94736d9a3abfc7663fd4ea082dd0ad"} Jan 23 10:57:01 crc kubenswrapper[4957]: I0123 10:57:01.866380 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5c857f95bd-qfvdg" event={"ID":"e52d56b3-bdc9-4c7e-8ebb-35302e811647","Type":"ContainerStarted","Data":"558678c105d84d7e483631c84f83334e85d3556c45f5d7a3e464d6ce8083f4bb"} Jan 23 10:57:01 crc kubenswrapper[4957]: I0123 10:57:01.866570 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5c857f95bd-qfvdg" Jan 23 10:57:01 crc kubenswrapper[4957]: I0123 10:57:01.873653 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-b99f7d5cb-brfvx" event={"ID":"7b4feff5-e35c-4518-a9b8-4e39d86c8d56","Type":"ContainerStarted","Data":"761f9f5e11a7afbeda52e784bc01c67a21247c86138e4b5191845fb93aa9f29e"} Jan 23 10:57:01 crc kubenswrapper[4957]: I0123 10:57:01.873707 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-b99f7d5cb-brfvx" event={"ID":"7b4feff5-e35c-4518-a9b8-4e39d86c8d56","Type":"ContainerStarted","Data":"7a3c544d6b11d42aa9c67b12d2c82b3c699bf234d6da9544b4d56999e529053f"} Jan 23 10:57:01 crc kubenswrapper[4957]: I0123 10:57:01.874382 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-b99f7d5cb-brfvx" Jan 23 10:57:01 crc kubenswrapper[4957]: I0123 10:57:01.878880 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-b99f7d5cb-brfvx" Jan 23 10:57:01 crc kubenswrapper[4957]: I0123 10:57:01.885714 4957 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5c857f95bd-qfvdg" podStartSLOduration=2.885688288 podStartE2EDuration="2.885688288s" podCreationTimestamp="2026-01-23 10:56:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-23 10:57:01.884694819 +0000 UTC m=+331.421947526" watchObservedRunningTime="2026-01-23 10:57:01.885688288 +0000 UTC m=+331.422940985" Jan 23 10:57:01 crc kubenswrapper[4957]: I0123 10:57:01.993776 4957 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-b99f7d5cb-brfvx" podStartSLOduration=2.993759038 podStartE2EDuration="2.993759038s" podCreationTimestamp="2026-01-23 10:56:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-23 10:57:01.906688076 +0000 UTC m=+331.443940763" watchObservedRunningTime="2026-01-23 10:57:01.993759038 +0000 UTC m=+331.531011725" Jan 23 10:57:01 crc kubenswrapper[4957]: I0123 10:57:01.995092 4957 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-7f8484fbcc-vbk8p"] Jan 23 10:57:02 crc kubenswrapper[4957]: W0123 10:57:02.006473 4957 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d80894e_00e6_4d20_9ce3_398bb8f27e66.slice/crio-0c9208ab6f0815ed03bd94994a36a346e5e9f798b78cd7f46b1cbce3e8cc1320 WatchSource:0}: Error finding container 0c9208ab6f0815ed03bd94994a36a346e5e9f798b78cd7f46b1cbce3e8cc1320: Status 404 returned error can't find the container with id 0c9208ab6f0815ed03bd94994a36a346e5e9f798b78cd7f46b1cbce3e8cc1320 Jan 23 10:57:02 crc kubenswrapper[4957]: I0123 10:57:02.068275 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5c857f95bd-qfvdg" Jan 23 10:57:02 crc kubenswrapper[4957]: I0123 10:57:02.152754 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 23 10:57:02 crc kubenswrapper[4957]: I0123 10:57:02.354924 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Jan 23 10:57:02 crc kubenswrapper[4957]: I0123 10:57:02.499709 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Jan 23 10:57:02 crc kubenswrapper[4957]: I0123 10:57:02.561087 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Jan 23 10:57:02 crc kubenswrapper[4957]: I0123 10:57:02.612795 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Jan 23 10:57:02 crc kubenswrapper[4957]: I0123 10:57:02.747371 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Jan 23 10:57:02 crc kubenswrapper[4957]: I0123 10:57:02.753190 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Jan 23 10:57:02 crc kubenswrapper[4957]: I0123 10:57:02.777172 4957 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b54be7c-8091-4cb4-bca2-8bb0691cd1ab" path="/var/lib/kubelet/pods/1b54be7c-8091-4cb4-bca2-8bb0691cd1ab/volumes" Jan 23 10:57:02 crc kubenswrapper[4957]: I0123 10:57:02.777699 4957 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="336f11b9-7c5e-47c9-86cd-8fffc5f6caf1" path="/var/lib/kubelet/pods/336f11b9-7c5e-47c9-86cd-8fffc5f6caf1/volumes" Jan 23 10:57:02 crc kubenswrapper[4957]: I0123 10:57:02.778731 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Jan 23 10:57:02 crc kubenswrapper[4957]: I0123 10:57:02.880616 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7f8484fbcc-vbk8p" event={"ID":"3d80894e-00e6-4d20-9ce3-398bb8f27e66","Type":"ContainerStarted","Data":"e1c5dff365b677dc35eb771d61e7d59204f4f76c06baf59b069eaf534c0562d0"} Jan 23 10:57:02 crc kubenswrapper[4957]: I0123 10:57:02.880680 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7f8484fbcc-vbk8p" event={"ID":"3d80894e-00e6-4d20-9ce3-398bb8f27e66","Type":"ContainerStarted","Data":"0c9208ab6f0815ed03bd94994a36a346e5e9f798b78cd7f46b1cbce3e8cc1320"} Jan 23 
10:57:02 crc kubenswrapper[4957]: I0123 10:57:02.904102 4957 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-7f8484fbcc-vbk8p" podStartSLOduration=99.904087254 podStartE2EDuration="1m39.904087254s" podCreationTimestamp="2026-01-23 10:55:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-23 10:57:02.901102035 +0000 UTC m=+332.438354722" watchObservedRunningTime="2026-01-23 10:57:02.904087254 +0000 UTC m=+332.441339941" Jan 23 10:57:02 crc kubenswrapper[4957]: I0123 10:57:02.929317 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Jan 23 10:57:03 crc kubenswrapper[4957]: I0123 10:57:03.093569 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Jan 23 10:57:03 crc kubenswrapper[4957]: I0123 10:57:03.122883 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Jan 23 10:57:03 crc kubenswrapper[4957]: I0123 10:57:03.138400 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 23 10:57:03 crc kubenswrapper[4957]: I0123 10:57:03.401971 4957 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Jan 23 10:57:03 crc kubenswrapper[4957]: I0123 10:57:03.762843 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Jan 23 10:57:03 crc kubenswrapper[4957]: I0123 10:57:03.886804 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-7f8484fbcc-vbk8p" Jan 23 10:57:03 crc kubenswrapper[4957]: I0123 10:57:03.893289 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-7f8484fbcc-vbk8p" Jan 23 10:57:04 crc kubenswrapper[4957]: I0123 10:57:04.223190 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Jan 23 10:57:04 crc kubenswrapper[4957]: I0123 10:57:04.372467 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Jan 23 10:57:05 crc kubenswrapper[4957]: I0123 10:57:05.158603 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Jan 23 10:57:05 crc kubenswrapper[4957]: I0123 10:57:05.305312 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Jan 23 10:57:05 crc kubenswrapper[4957]: I0123 10:57:05.475438 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 23 10:57:05 crc kubenswrapper[4957]: I0123 10:57:05.671324 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Jan 23 10:57:05 crc kubenswrapper[4957]: I0123 10:57:05.973408 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Jan 23 10:57:06 crc kubenswrapper[4957]: I0123 
10:57:06.260237 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Jan 23 10:57:06 crc kubenswrapper[4957]: I0123 10:57:06.946651 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Jan 23 10:57:07 crc kubenswrapper[4957]: I0123 10:57:07.725590 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Jan 23 10:57:08 crc kubenswrapper[4957]: I0123 10:57:08.178184 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Jan 23 10:57:08 crc kubenswrapper[4957]: I0123 10:57:08.455438 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Jan 23 10:57:08 crc kubenswrapper[4957]: I0123 10:57:08.561233 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Jan 23 10:57:09 crc kubenswrapper[4957]: I0123 10:57:09.338849 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Jan 23 10:57:10 crc kubenswrapper[4957]: I0123 10:57:10.076126 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Jan 23 10:57:10 crc kubenswrapper[4957]: I0123 10:57:10.526919 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Jan 23 10:57:12 crc kubenswrapper[4957]: I0123 10:57:12.463735 4957 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-hgc64" podUID="e8dec53f-51c9-4e3d-b111-45152d5b0c71" containerName="registry" containerID="cri-o://865149ad3d793bd145d4f28055a8b145b6fbf740f68717bb8ec07cc81d0f1f4b" gracePeriod=30 Jan 23 10:57:12 crc kubenswrapper[4957]: I0123 10:57:12.913961 4957 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-hgc64" Jan 23 10:57:12 crc kubenswrapper[4957]: I0123 10:57:12.942340 4957 generic.go:334] "Generic (PLEG): container finished" podID="e8dec53f-51c9-4e3d-b111-45152d5b0c71" containerID="865149ad3d793bd145d4f28055a8b145b6fbf740f68717bb8ec07cc81d0f1f4b" exitCode=0 Jan 23 10:57:12 crc kubenswrapper[4957]: I0123 10:57:12.942381 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-hgc64" event={"ID":"e8dec53f-51c9-4e3d-b111-45152d5b0c71","Type":"ContainerDied","Data":"865149ad3d793bd145d4f28055a8b145b6fbf740f68717bb8ec07cc81d0f1f4b"} Jan 23 10:57:12 crc kubenswrapper[4957]: I0123 10:57:12.942407 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-hgc64" event={"ID":"e8dec53f-51c9-4e3d-b111-45152d5b0c71","Type":"ContainerDied","Data":"fe27ac06f9ec11060c6e98190f122fef33d38c7771103781850cf7a5dc35db47"} Jan 23 10:57:12 crc kubenswrapper[4957]: I0123 10:57:12.942422 4957 scope.go:117] "RemoveContainer" containerID="865149ad3d793bd145d4f28055a8b145b6fbf740f68717bb8ec07cc81d0f1f4b" Jan 23 10:57:12 crc kubenswrapper[4957]: I0123 10:57:12.942526 4957 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-hgc64" Jan 23 10:57:12 crc kubenswrapper[4957]: E0123 10:57:12.942690 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="mounted volumes=[bound-sa-token ca-trust-extracted installation-pull-secrets kube-api-access-nc9g4 registry-certificates registry-storage registry-tls trusted-ca]: error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.9:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-hgc64" podUID="e8dec53f-51c9-4e3d-b111-45152d5b0c71" Jan 23 10:57:12 crc kubenswrapper[4957]: I0123 10:57:12.963355 4957 scope.go:117] "RemoveContainer" containerID="865149ad3d793bd145d4f28055a8b145b6fbf740f68717bb8ec07cc81d0f1f4b" Jan 23 10:57:12 crc kubenswrapper[4957]: E0123 10:57:12.963798 4957 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"865149ad3d793bd145d4f28055a8b145b6fbf740f68717bb8ec07cc81d0f1f4b\": container with ID starting with 865149ad3d793bd145d4f28055a8b145b6fbf740f68717bb8ec07cc81d0f1f4b not found: ID does not exist" containerID="865149ad3d793bd145d4f28055a8b145b6fbf740f68717bb8ec07cc81d0f1f4b" Jan 23 10:57:12 crc kubenswrapper[4957]: I0123 10:57:12.963846 4957 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"865149ad3d793bd145d4f28055a8b145b6fbf740f68717bb8ec07cc81d0f1f4b"} err="failed to get container status \"865149ad3d793bd145d4f28055a8b145b6fbf740f68717bb8ec07cc81d0f1f4b\": rpc error: code = NotFound desc = could not find container \"865149ad3d793bd145d4f28055a8b145b6fbf740f68717bb8ec07cc81d0f1f4b\": container with ID starting with 865149ad3d793bd145d4f28055a8b145b6fbf740f68717bb8ec07cc81d0f1f4b not found: ID does not exist" Jan 23 10:57:13 crc kubenswrapper[4957]: I0123 10:57:13.041826 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e8dec53f-51c9-4e3d-b111-45152d5b0c71-registry-certificates\") pod \"e8dec53f-51c9-4e3d-b111-45152d5b0c71\" (UID: \"e8dec53f-51c9-4e3d-b111-45152d5b0c71\") " Jan 23 10:57:13 crc kubenswrapper[4957]: I0123 10:57:13.041946 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e8dec53f-51c9-4e3d-b111-45152d5b0c71-installation-pull-secrets\") pod \"e8dec53f-51c9-4e3d-b111-45152d5b0c71\" (UID: \"e8dec53f-51c9-4e3d-b111-45152d5b0c71\") " Jan 23 10:57:13 crc kubenswrapper[4957]: I0123 10:57:13.041977 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e8dec53f-51c9-4e3d-b111-45152d5b0c71-trusted-ca\") pod \"e8dec53f-51c9-4e3d-b111-45152d5b0c71\" (UID: \"e8dec53f-51c9-4e3d-b111-45152d5b0c71\") " Jan 23 10:57:13 crc kubenswrapper[4957]: I0123 10:57:13.042123 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"e8dec53f-51c9-4e3d-b111-45152d5b0c71\" (UID: \"e8dec53f-51c9-4e3d-b111-45152d5b0c71\") " Jan 23 10:57:13 crc kubenswrapper[4957]: I0123 10:57:13.042173 4957 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e8dec53f-51c9-4e3d-b111-45152d5b0c71-ca-trust-extracted\") pod \"e8dec53f-51c9-4e3d-b111-45152d5b0c71\" (UID: \"e8dec53f-51c9-4e3d-b111-45152d5b0c71\") " Jan 23 10:57:13 crc kubenswrapper[4957]: I0123 10:57:13.042218 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e8dec53f-51c9-4e3d-b111-45152d5b0c71-registry-tls\") pod \"e8dec53f-51c9-4e3d-b111-45152d5b0c71\" (UID: \"e8dec53f-51c9-4e3d-b111-45152d5b0c71\") " Jan 23 10:57:13 crc kubenswrapper[4957]: I0123 10:57:13.042242 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e8dec53f-51c9-4e3d-b111-45152d5b0c71-bound-sa-token\") pod \"e8dec53f-51c9-4e3d-b111-45152d5b0c71\" (UID: \"e8dec53f-51c9-4e3d-b111-45152d5b0c71\") " Jan 23 10:57:13 crc kubenswrapper[4957]: I0123 10:57:13.042332 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nc9g4\" (UniqueName: \"kubernetes.io/projected/e8dec53f-51c9-4e3d-b111-45152d5b0c71-kube-api-access-nc9g4\") pod \"e8dec53f-51c9-4e3d-b111-45152d5b0c71\" (UID: \"e8dec53f-51c9-4e3d-b111-45152d5b0c71\") " Jan 23 10:57:13 crc kubenswrapper[4957]: I0123 10:57:13.043228 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8dec53f-51c9-4e3d-b111-45152d5b0c71-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "e8dec53f-51c9-4e3d-b111-45152d5b0c71" (UID: "e8dec53f-51c9-4e3d-b111-45152d5b0c71"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 10:57:13 crc kubenswrapper[4957]: I0123 10:57:13.043715 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8dec53f-51c9-4e3d-b111-45152d5b0c71-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "e8dec53f-51c9-4e3d-b111-45152d5b0c71" (UID: "e8dec53f-51c9-4e3d-b111-45152d5b0c71"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 10:57:13 crc kubenswrapper[4957]: I0123 10:57:13.063955 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8dec53f-51c9-4e3d-b111-45152d5b0c71-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "e8dec53f-51c9-4e3d-b111-45152d5b0c71" (UID: "e8dec53f-51c9-4e3d-b111-45152d5b0c71"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 23 10:57:13 crc kubenswrapper[4957]: I0123 10:57:13.064119 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8dec53f-51c9-4e3d-b111-45152d5b0c71-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "e8dec53f-51c9-4e3d-b111-45152d5b0c71" (UID: "e8dec53f-51c9-4e3d-b111-45152d5b0c71"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 23 10:57:13 crc kubenswrapper[4957]: I0123 10:57:13.064343 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8dec53f-51c9-4e3d-b111-45152d5b0c71-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "e8dec53f-51c9-4e3d-b111-45152d5b0c71" (UID: "e8dec53f-51c9-4e3d-b111-45152d5b0c71"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 23 10:57:13 crc kubenswrapper[4957]: I0123 10:57:13.064458 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "e8dec53f-51c9-4e3d-b111-45152d5b0c71" (UID: "e8dec53f-51c9-4e3d-b111-45152d5b0c71"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 23 10:57:13 crc kubenswrapper[4957]: I0123 10:57:13.064491 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8dec53f-51c9-4e3d-b111-45152d5b0c71-kube-api-access-nc9g4" (OuterVolumeSpecName: "kube-api-access-nc9g4") pod "e8dec53f-51c9-4e3d-b111-45152d5b0c71" (UID: "e8dec53f-51c9-4e3d-b111-45152d5b0c71"). InnerVolumeSpecName "kube-api-access-nc9g4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 23 10:57:13 crc kubenswrapper[4957]: I0123 10:57:13.079646 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8dec53f-51c9-4e3d-b111-45152d5b0c71-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "e8dec53f-51c9-4e3d-b111-45152d5b0c71" (UID: "e8dec53f-51c9-4e3d-b111-45152d5b0c71"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 23 10:57:13 crc kubenswrapper[4957]: I0123 10:57:13.144658 4957 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nc9g4\" (UniqueName: \"kubernetes.io/projected/e8dec53f-51c9-4e3d-b111-45152d5b0c71-kube-api-access-nc9g4\") on node \"crc\" DevicePath \"\"" Jan 23 10:57:13 crc kubenswrapper[4957]: I0123 10:57:13.144697 4957 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e8dec53f-51c9-4e3d-b111-45152d5b0c71-registry-certificates\") on node \"crc\" DevicePath \"\"" Jan 23 10:57:13 crc kubenswrapper[4957]: I0123 10:57:13.144706 4957 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e8dec53f-51c9-4e3d-b111-45152d5b0c71-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Jan 23 10:57:13 crc kubenswrapper[4957]: I0123 10:57:13.144717 4957 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e8dec53f-51c9-4e3d-b111-45152d5b0c71-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 23 10:57:13 crc kubenswrapper[4957]: I0123 10:57:13.144729 4957 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e8dec53f-51c9-4e3d-b111-45152d5b0c71-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Jan 23 10:57:13 crc kubenswrapper[4957]: I0123 10:57:13.144738 4957 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e8dec53f-51c9-4e3d-b111-45152d5b0c71-registry-tls\") on node \"crc\" DevicePath \"\"" Jan 23 10:57:13 crc kubenswrapper[4957]: I0123 10:57:13.144749 4957 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e8dec53f-51c9-4e3d-b111-45152d5b0c71-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 23 10:57:13 crc kubenswrapper[4957]: I0123 10:57:13.951036 4957 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-hgc64" Jan 23 10:57:13 crc kubenswrapper[4957]: I0123 10:57:13.993813 4957 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-hgc64"] Jan 23 10:57:13 crc kubenswrapper[4957]: I0123 10:57:13.998364 4957 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-hgc64"] Jan 23 10:57:14 crc kubenswrapper[4957]: I0123 10:57:14.776589 4957 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8dec53f-51c9-4e3d-b111-45152d5b0c71" path="/var/lib/kubelet/pods/e8dec53f-51c9-4e3d-b111-45152d5b0c71/volumes" Jan 23 10:57:19 crc kubenswrapper[4957]: I0123 10:57:19.489413 4957 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c857f95bd-qfvdg"] Jan 23 10:57:19 crc kubenswrapper[4957]: I0123 10:57:19.495424 4957 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-5c857f95bd-qfvdg" podUID="e52d56b3-bdc9-4c7e-8ebb-35302e811647" containerName="route-controller-manager" containerID="cri-o://2d41a0569bbae0bd2046bc3f8bb9659cbb94736d9a3abfc7663fd4ea082dd0ad" gracePeriod=30 Jan 23 10:57:19 crc kubenswrapper[4957]: I0123 10:57:19.512756 4957 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-b99f7d5cb-brfvx"] Jan 23 10:57:19 crc kubenswrapper[4957]: I0123 10:57:19.513377 4957 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-b99f7d5cb-brfvx" podUID="7b4feff5-e35c-4518-a9b8-4e39d86c8d56" containerName="controller-manager" containerID="cri-o://761f9f5e11a7afbeda52e784bc01c67a21247c86138e4b5191845fb93aa9f29e" gracePeriod=30 Jan 23 10:57:19 crc kubenswrapper[4957]: I0123 10:57:19.994206 4957 generic.go:334] "Generic (PLEG): container finished" podID="7b4feff5-e35c-4518-a9b8-4e39d86c8d56" containerID="761f9f5e11a7afbeda52e784bc01c67a21247c86138e4b5191845fb93aa9f29e" exitCode=0 Jan 23 10:57:19 crc kubenswrapper[4957]: I0123 10:57:19.994271 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-b99f7d5cb-brfvx" event={"ID":"7b4feff5-e35c-4518-a9b8-4e39d86c8d56","Type":"ContainerDied","Data":"761f9f5e11a7afbeda52e784bc01c67a21247c86138e4b5191845fb93aa9f29e"} Jan 23 10:57:19 crc kubenswrapper[4957]: I0123 10:57:19.995736 4957 generic.go:334] "Generic (PLEG): container finished" podID="e52d56b3-bdc9-4c7e-8ebb-35302e811647" containerID="2d41a0569bbae0bd2046bc3f8bb9659cbb94736d9a3abfc7663fd4ea082dd0ad" exitCode=0 Jan 23 10:57:19 crc kubenswrapper[4957]: I0123 10:57:19.995793 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5c857f95bd-qfvdg" event={"ID":"e52d56b3-bdc9-4c7e-8ebb-35302e811647","Type":"ContainerDied","Data":"2d41a0569bbae0bd2046bc3f8bb9659cbb94736d9a3abfc7663fd4ea082dd0ad"} Jan 23 10:57:20 crc kubenswrapper[4957]: I0123 10:57:20.045621 4957 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c857f95bd-qfvdg" Jan 23 10:57:20 crc kubenswrapper[4957]: I0123 10:57:20.051888 4957 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-b99f7d5cb-brfvx" Jan 23 10:57:20 crc kubenswrapper[4957]: I0123 10:57:20.148687 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e52d56b3-bdc9-4c7e-8ebb-35302e811647-serving-cert\") pod \"e52d56b3-bdc9-4c7e-8ebb-35302e811647\" (UID: \"e52d56b3-bdc9-4c7e-8ebb-35302e811647\") " Jan 23 10:57:20 crc kubenswrapper[4957]: I0123 10:57:20.148745 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7b4feff5-e35c-4518-a9b8-4e39d86c8d56-serving-cert\") pod \"7b4feff5-e35c-4518-a9b8-4e39d86c8d56\" (UID: \"7b4feff5-e35c-4518-a9b8-4e39d86c8d56\") " Jan 23 10:57:20 crc kubenswrapper[4957]: I0123 10:57:20.148799 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b4feff5-e35c-4518-a9b8-4e39d86c8d56-config\") pod \"7b4feff5-e35c-4518-a9b8-4e39d86c8d56\" (UID: \"7b4feff5-e35c-4518-a9b8-4e39d86c8d56\") " Jan 23 10:57:20 crc kubenswrapper[4957]: I0123 10:57:20.148840 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-78msd\" (UniqueName: \"kubernetes.io/projected/7b4feff5-e35c-4518-a9b8-4e39d86c8d56-kube-api-access-78msd\") pod \"7b4feff5-e35c-4518-a9b8-4e39d86c8d56\" (UID: \"7b4feff5-e35c-4518-a9b8-4e39d86c8d56\") " Jan 23 10:57:20 crc kubenswrapper[4957]: I0123 10:57:20.148860 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fmvqc\" (UniqueName: \"kubernetes.io/projected/e52d56b3-bdc9-4c7e-8ebb-35302e811647-kube-api-access-fmvqc\") pod \"e52d56b3-bdc9-4c7e-8ebb-35302e811647\" (UID: \"e52d56b3-bdc9-4c7e-8ebb-35302e811647\") " Jan 23 10:57:20 crc kubenswrapper[4957]: I0123 10:57:20.148876 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7b4feff5-e35c-4518-a9b8-4e39d86c8d56-proxy-ca-bundles\") pod \"7b4feff5-e35c-4518-a9b8-4e39d86c8d56\" (UID: \"7b4feff5-e35c-4518-a9b8-4e39d86c8d56\") " Jan 23 10:57:20 crc kubenswrapper[4957]: I0123 10:57:20.148914 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e52d56b3-bdc9-4c7e-8ebb-35302e811647-config\") pod \"e52d56b3-bdc9-4c7e-8ebb-35302e811647\" (UID: \"e52d56b3-bdc9-4c7e-8ebb-35302e811647\") " Jan 23 10:57:20 crc kubenswrapper[4957]: I0123 10:57:20.148971 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7b4feff5-e35c-4518-a9b8-4e39d86c8d56-client-ca\") pod \"7b4feff5-e35c-4518-a9b8-4e39d86c8d56\" (UID: \"7b4feff5-e35c-4518-a9b8-4e39d86c8d56\") " Jan 23 10:57:20 crc kubenswrapper[4957]: I0123 10:57:20.148987 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e52d56b3-bdc9-4c7e-8ebb-35302e811647-client-ca\") pod \"e52d56b3-bdc9-4c7e-8ebb-35302e811647\" (UID: \"e52d56b3-bdc9-4c7e-8ebb-35302e811647\") " Jan 23 10:57:20 crc kubenswrapper[4957]: I0123 10:57:20.150131 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e52d56b3-bdc9-4c7e-8ebb-35302e811647-client-ca" (OuterVolumeSpecName: "client-ca") pod "e52d56b3-bdc9-4c7e-8ebb-35302e811647" (UID: 
"e52d56b3-bdc9-4c7e-8ebb-35302e811647"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 10:57:20 crc kubenswrapper[4957]: I0123 10:57:20.150226 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e52d56b3-bdc9-4c7e-8ebb-35302e811647-config" (OuterVolumeSpecName: "config") pod "e52d56b3-bdc9-4c7e-8ebb-35302e811647" (UID: "e52d56b3-bdc9-4c7e-8ebb-35302e811647"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 10:57:20 crc kubenswrapper[4957]: I0123 10:57:20.150904 4957 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e52d56b3-bdc9-4c7e-8ebb-35302e811647-config\") on node \"crc\" DevicePath \"\"" Jan 23 10:57:20 crc kubenswrapper[4957]: I0123 10:57:20.150995 4957 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e52d56b3-bdc9-4c7e-8ebb-35302e811647-client-ca\") on node \"crc\" DevicePath \"\"" Jan 23 10:57:20 crc kubenswrapper[4957]: I0123 10:57:20.150934 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b4feff5-e35c-4518-a9b8-4e39d86c8d56-client-ca" (OuterVolumeSpecName: "client-ca") pod "7b4feff5-e35c-4518-a9b8-4e39d86c8d56" (UID: "7b4feff5-e35c-4518-a9b8-4e39d86c8d56"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 10:57:20 crc kubenswrapper[4957]: I0123 10:57:20.150978 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b4feff5-e35c-4518-a9b8-4e39d86c8d56-config" (OuterVolumeSpecName: "config") pod "7b4feff5-e35c-4518-a9b8-4e39d86c8d56" (UID: "7b4feff5-e35c-4518-a9b8-4e39d86c8d56"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 10:57:20 crc kubenswrapper[4957]: I0123 10:57:20.151044 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b4feff5-e35c-4518-a9b8-4e39d86c8d56-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7b4feff5-e35c-4518-a9b8-4e39d86c8d56" (UID: "7b4feff5-e35c-4518-a9b8-4e39d86c8d56"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 10:57:20 crc kubenswrapper[4957]: I0123 10:57:20.154709 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e52d56b3-bdc9-4c7e-8ebb-35302e811647-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e52d56b3-bdc9-4c7e-8ebb-35302e811647" (UID: "e52d56b3-bdc9-4c7e-8ebb-35302e811647"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 23 10:57:20 crc kubenswrapper[4957]: I0123 10:57:20.154819 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b4feff5-e35c-4518-a9b8-4e39d86c8d56-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7b4feff5-e35c-4518-a9b8-4e39d86c8d56" (UID: "7b4feff5-e35c-4518-a9b8-4e39d86c8d56"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 23 10:57:20 crc kubenswrapper[4957]: I0123 10:57:20.154965 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e52d56b3-bdc9-4c7e-8ebb-35302e811647-kube-api-access-fmvqc" (OuterVolumeSpecName: "kube-api-access-fmvqc") pod "e52d56b3-bdc9-4c7e-8ebb-35302e811647" (UID: "e52d56b3-bdc9-4c7e-8ebb-35302e811647"). InnerVolumeSpecName "kube-api-access-fmvqc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 23 10:57:20 crc kubenswrapper[4957]: I0123 10:57:20.155517 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b4feff5-e35c-4518-a9b8-4e39d86c8d56-kube-api-access-78msd" (OuterVolumeSpecName: "kube-api-access-78msd") pod "7b4feff5-e35c-4518-a9b8-4e39d86c8d56" (UID: "7b4feff5-e35c-4518-a9b8-4e39d86c8d56"). InnerVolumeSpecName "kube-api-access-78msd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 23 10:57:20 crc kubenswrapper[4957]: I0123 10:57:20.252592 4957 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7b4feff5-e35c-4518-a9b8-4e39d86c8d56-client-ca\") on node \"crc\" DevicePath \"\"" Jan 23 10:57:20 crc kubenswrapper[4957]: I0123 10:57:20.252634 4957 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e52d56b3-bdc9-4c7e-8ebb-35302e811647-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 23 10:57:20 crc kubenswrapper[4957]: I0123 10:57:20.252645 4957 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7b4feff5-e35c-4518-a9b8-4e39d86c8d56-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 23 10:57:20 crc kubenswrapper[4957]: I0123 10:57:20.252657 4957 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b4feff5-e35c-4518-a9b8-4e39d86c8d56-config\") on node \"crc\" DevicePath \"\"" Jan 23 10:57:20 crc kubenswrapper[4957]: I0123 10:57:20.252670 4957 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-78msd\" (UniqueName: \"kubernetes.io/projected/7b4feff5-e35c-4518-a9b8-4e39d86c8d56-kube-api-access-78msd\") on node \"crc\" DevicePath \"\"" Jan 23 10:57:20 crc kubenswrapper[4957]: I0123 10:57:20.252685 4957 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fmvqc\" (UniqueName: \"kubernetes.io/projected/e52d56b3-bdc9-4c7e-8ebb-35302e811647-kube-api-access-fmvqc\") on node \"crc\" DevicePath \"\"" Jan 23 10:57:20 crc kubenswrapper[4957]: I0123 10:57:20.252696 4957 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7b4feff5-e35c-4518-a9b8-4e39d86c8d56-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 23 10:57:20 crc kubenswrapper[4957]: I0123 10:57:20.665209 4957 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c6ddf959-tkc64"] Jan 23 10:57:20 crc kubenswrapper[4957]: E0123 10:57:20.665852 4957 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b4feff5-e35c-4518-a9b8-4e39d86c8d56" containerName="controller-manager" Jan 23 10:57:20 crc kubenswrapper[4957]: I0123 10:57:20.665881 4957 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b4feff5-e35c-4518-a9b8-4e39d86c8d56" containerName="controller-manager" Jan 23 10:57:20 crc kubenswrapper[4957]: E0123 10:57:20.665902 4957 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="e52d56b3-bdc9-4c7e-8ebb-35302e811647" containerName="route-controller-manager" Jan 23 10:57:20 crc kubenswrapper[4957]: I0123 10:57:20.665918 4957 state_mem.go:107] "Deleted CPUSet assignment" podUID="e52d56b3-bdc9-4c7e-8ebb-35302e811647" containerName="route-controller-manager" Jan 23 10:57:20 crc kubenswrapper[4957]: E0123 10:57:20.665953 4957 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8dec53f-51c9-4e3d-b111-45152d5b0c71" containerName="registry" Jan 23 10:57:20 crc kubenswrapper[4957]: I0123 10:57:20.665971 4957 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8dec53f-51c9-4e3d-b111-45152d5b0c71" containerName="registry" Jan 23 10:57:20 crc kubenswrapper[4957]: I0123 10:57:20.666201 4957 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8dec53f-51c9-4e3d-b111-45152d5b0c71" containerName="registry" Jan 23 10:57:20 crc kubenswrapper[4957]: I0123 10:57:20.666229 4957 memory_manager.go:354] "RemoveStaleState removing state" podUID="e52d56b3-bdc9-4c7e-8ebb-35302e811647" containerName="route-controller-manager" Jan 23 10:57:20 crc kubenswrapper[4957]: I0123 10:57:20.666248 4957 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b4feff5-e35c-4518-a9b8-4e39d86c8d56" containerName="controller-manager" Jan 23 10:57:20 crc kubenswrapper[4957]: I0123 10:57:20.667033 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c6ddf959-tkc64" Jan 23 10:57:20 crc kubenswrapper[4957]: I0123 10:57:20.671338 4957 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-74577df4c5-62dgw"] Jan 23 10:57:20 crc kubenswrapper[4957]: I0123 10:57:20.672413 4957 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-74577df4c5-62dgw" Jan 23 10:57:20 crc kubenswrapper[4957]: I0123 10:57:20.684378 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c6ddf959-tkc64"] Jan 23 10:57:20 crc kubenswrapper[4957]: I0123 10:57:20.708421 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-74577df4c5-62dgw"] Jan 23 10:57:20 crc kubenswrapper[4957]: I0123 10:57:20.759006 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b1a1bf7e-168b-45a3-b91f-9b131130b358-client-ca\") pod \"controller-manager-74577df4c5-62dgw\" (UID: \"b1a1bf7e-168b-45a3-b91f-9b131130b358\") " pod="openshift-controller-manager/controller-manager-74577df4c5-62dgw" Jan 23 10:57:20 crc kubenswrapper[4957]: I0123 10:57:20.759065 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56m75\" (UniqueName: \"kubernetes.io/projected/93e23c7f-b138-4218-ad40-a45ffe3c9bfa-kube-api-access-56m75\") pod \"route-controller-manager-5c6ddf959-tkc64\" (UID: \"93e23c7f-b138-4218-ad40-a45ffe3c9bfa\") " pod="openshift-route-controller-manager/route-controller-manager-5c6ddf959-tkc64" Jan 23 10:57:20 crc kubenswrapper[4957]: I0123 10:57:20.759105 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b1a1bf7e-168b-45a3-b91f-9b131130b358-proxy-ca-bundles\") pod \"controller-manager-74577df4c5-62dgw\" (UID: \"b1a1bf7e-168b-45a3-b91f-9b131130b358\") " pod="openshift-controller-manager/controller-manager-74577df4c5-62dgw" Jan 23 10:57:20 crc kubenswrapper[4957]: I0123 10:57:20.759131 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93e23c7f-b138-4218-ad40-a45ffe3c9bfa-config\") pod \"route-controller-manager-5c6ddf959-tkc64\" (UID: \"93e23c7f-b138-4218-ad40-a45ffe3c9bfa\") " pod="openshift-route-controller-manager/route-controller-manager-5c6ddf959-tkc64" Jan 23 10:57:20 crc kubenswrapper[4957]: I0123 10:57:20.759162 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b1a1bf7e-168b-45a3-b91f-9b131130b358-serving-cert\") pod \"controller-manager-74577df4c5-62dgw\" (UID: \"b1a1bf7e-168b-45a3-b91f-9b131130b358\") " pod="openshift-controller-manager/controller-manager-74577df4c5-62dgw" Jan 23 10:57:20 crc kubenswrapper[4957]: I0123 10:57:20.759196 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/93e23c7f-b138-4218-ad40-a45ffe3c9bfa-client-ca\") pod \"route-controller-manager-5c6ddf959-tkc64\" (UID: \"93e23c7f-b138-4218-ad40-a45ffe3c9bfa\") " pod="openshift-route-controller-manager/route-controller-manager-5c6ddf959-tkc64" Jan 23 10:57:20 crc kubenswrapper[4957]: I0123 10:57:20.759239 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/93e23c7f-b138-4218-ad40-a45ffe3c9bfa-serving-cert\") pod \"route-controller-manager-5c6ddf959-tkc64\" (UID: \"93e23c7f-b138-4218-ad40-a45ffe3c9bfa\") " 
pod="openshift-route-controller-manager/route-controller-manager-5c6ddf959-tkc64" Jan 23 10:57:20 crc kubenswrapper[4957]: I0123 10:57:20.759312 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1a1bf7e-168b-45a3-b91f-9b131130b358-config\") pod \"controller-manager-74577df4c5-62dgw\" (UID: \"b1a1bf7e-168b-45a3-b91f-9b131130b358\") " pod="openshift-controller-manager/controller-manager-74577df4c5-62dgw" Jan 23 10:57:20 crc kubenswrapper[4957]: I0123 10:57:20.759346 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jk7wf\" (UniqueName: \"kubernetes.io/projected/b1a1bf7e-168b-45a3-b91f-9b131130b358-kube-api-access-jk7wf\") pod \"controller-manager-74577df4c5-62dgw\" (UID: \"b1a1bf7e-168b-45a3-b91f-9b131130b358\") " pod="openshift-controller-manager/controller-manager-74577df4c5-62dgw" Jan 23 10:57:20 crc kubenswrapper[4957]: I0123 10:57:20.860965 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56m75\" (UniqueName: \"kubernetes.io/projected/93e23c7f-b138-4218-ad40-a45ffe3c9bfa-kube-api-access-56m75\") pod \"route-controller-manager-5c6ddf959-tkc64\" (UID: \"93e23c7f-b138-4218-ad40-a45ffe3c9bfa\") " pod="openshift-route-controller-manager/route-controller-manager-5c6ddf959-tkc64" Jan 23 10:57:20 crc kubenswrapper[4957]: I0123 10:57:20.861027 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b1a1bf7e-168b-45a3-b91f-9b131130b358-proxy-ca-bundles\") pod \"controller-manager-74577df4c5-62dgw\" (UID: \"b1a1bf7e-168b-45a3-b91f-9b131130b358\") " pod="openshift-controller-manager/controller-manager-74577df4c5-62dgw" Jan 23 10:57:20 crc kubenswrapper[4957]: I0123 10:57:20.861061 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93e23c7f-b138-4218-ad40-a45ffe3c9bfa-config\") pod \"route-controller-manager-5c6ddf959-tkc64\" (UID: \"93e23c7f-b138-4218-ad40-a45ffe3c9bfa\") " pod="openshift-route-controller-manager/route-controller-manager-5c6ddf959-tkc64" Jan 23 10:57:20 crc kubenswrapper[4957]: I0123 10:57:20.861101 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b1a1bf7e-168b-45a3-b91f-9b131130b358-serving-cert\") pod \"controller-manager-74577df4c5-62dgw\" (UID: \"b1a1bf7e-168b-45a3-b91f-9b131130b358\") " pod="openshift-controller-manager/controller-manager-74577df4c5-62dgw" Jan 23 10:57:20 crc kubenswrapper[4957]: I0123 10:57:20.861126 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/93e23c7f-b138-4218-ad40-a45ffe3c9bfa-client-ca\") pod \"route-controller-manager-5c6ddf959-tkc64\" (UID: \"93e23c7f-b138-4218-ad40-a45ffe3c9bfa\") " pod="openshift-route-controller-manager/route-controller-manager-5c6ddf959-tkc64" Jan 23 10:57:20 crc kubenswrapper[4957]: I0123 10:57:20.861157 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/93e23c7f-b138-4218-ad40-a45ffe3c9bfa-serving-cert\") pod \"route-controller-manager-5c6ddf959-tkc64\" (UID: \"93e23c7f-b138-4218-ad40-a45ffe3c9bfa\") " pod="openshift-route-controller-manager/route-controller-manager-5c6ddf959-tkc64" Jan 23 10:57:20 
crc kubenswrapper[4957]: I0123 10:57:20.861179 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1a1bf7e-168b-45a3-b91f-9b131130b358-config\") pod \"controller-manager-74577df4c5-62dgw\" (UID: \"b1a1bf7e-168b-45a3-b91f-9b131130b358\") " pod="openshift-controller-manager/controller-manager-74577df4c5-62dgw" Jan 23 10:57:20 crc kubenswrapper[4957]: I0123 10:57:20.861205 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jk7wf\" (UniqueName: \"kubernetes.io/projected/b1a1bf7e-168b-45a3-b91f-9b131130b358-kube-api-access-jk7wf\") pod \"controller-manager-74577df4c5-62dgw\" (UID: \"b1a1bf7e-168b-45a3-b91f-9b131130b358\") " pod="openshift-controller-manager/controller-manager-74577df4c5-62dgw" Jan 23 10:57:20 crc kubenswrapper[4957]: I0123 10:57:20.861265 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b1a1bf7e-168b-45a3-b91f-9b131130b358-client-ca\") pod \"controller-manager-74577df4c5-62dgw\" (UID: \"b1a1bf7e-168b-45a3-b91f-9b131130b358\") " pod="openshift-controller-manager/controller-manager-74577df4c5-62dgw" Jan 23 10:57:20 crc kubenswrapper[4957]: I0123 10:57:20.862168 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b1a1bf7e-168b-45a3-b91f-9b131130b358-client-ca\") pod \"controller-manager-74577df4c5-62dgw\" (UID: \"b1a1bf7e-168b-45a3-b91f-9b131130b358\") " pod="openshift-controller-manager/controller-manager-74577df4c5-62dgw" Jan 23 10:57:20 crc kubenswrapper[4957]: I0123 10:57:20.863085 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/93e23c7f-b138-4218-ad40-a45ffe3c9bfa-client-ca\") pod \"route-controller-manager-5c6ddf959-tkc64\" (UID: \"93e23c7f-b138-4218-ad40-a45ffe3c9bfa\") " pod="openshift-route-controller-manager/route-controller-manager-5c6ddf959-tkc64" Jan 23 10:57:20 crc kubenswrapper[4957]: I0123 10:57:20.863627 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93e23c7f-b138-4218-ad40-a45ffe3c9bfa-config\") pod \"route-controller-manager-5c6ddf959-tkc64\" (UID: \"93e23c7f-b138-4218-ad40-a45ffe3c9bfa\") " pod="openshift-route-controller-manager/route-controller-manager-5c6ddf959-tkc64" Jan 23 10:57:20 crc kubenswrapper[4957]: I0123 10:57:20.864575 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1a1bf7e-168b-45a3-b91f-9b131130b358-config\") pod \"controller-manager-74577df4c5-62dgw\" (UID: \"b1a1bf7e-168b-45a3-b91f-9b131130b358\") " pod="openshift-controller-manager/controller-manager-74577df4c5-62dgw" Jan 23 10:57:20 crc kubenswrapper[4957]: I0123 10:57:20.864742 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b1a1bf7e-168b-45a3-b91f-9b131130b358-proxy-ca-bundles\") pod \"controller-manager-74577df4c5-62dgw\" (UID: \"b1a1bf7e-168b-45a3-b91f-9b131130b358\") " pod="openshift-controller-manager/controller-manager-74577df4c5-62dgw" Jan 23 10:57:20 crc kubenswrapper[4957]: I0123 10:57:20.867462 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/93e23c7f-b138-4218-ad40-a45ffe3c9bfa-serving-cert\") pod 
\"route-controller-manager-5c6ddf959-tkc64\" (UID: \"93e23c7f-b138-4218-ad40-a45ffe3c9bfa\") " pod="openshift-route-controller-manager/route-controller-manager-5c6ddf959-tkc64" Jan 23 10:57:20 crc kubenswrapper[4957]: I0123 10:57:20.868041 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b1a1bf7e-168b-45a3-b91f-9b131130b358-serving-cert\") pod \"controller-manager-74577df4c5-62dgw\" (UID: \"b1a1bf7e-168b-45a3-b91f-9b131130b358\") " pod="openshift-controller-manager/controller-manager-74577df4c5-62dgw" Jan 23 10:57:20 crc kubenswrapper[4957]: I0123 10:57:20.881916 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jk7wf\" (UniqueName: \"kubernetes.io/projected/b1a1bf7e-168b-45a3-b91f-9b131130b358-kube-api-access-jk7wf\") pod \"controller-manager-74577df4c5-62dgw\" (UID: \"b1a1bf7e-168b-45a3-b91f-9b131130b358\") " pod="openshift-controller-manager/controller-manager-74577df4c5-62dgw" Jan 23 10:57:20 crc kubenswrapper[4957]: I0123 10:57:20.891046 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56m75\" (UniqueName: \"kubernetes.io/projected/93e23c7f-b138-4218-ad40-a45ffe3c9bfa-kube-api-access-56m75\") pod \"route-controller-manager-5c6ddf959-tkc64\" (UID: \"93e23c7f-b138-4218-ad40-a45ffe3c9bfa\") " pod="openshift-route-controller-manager/route-controller-manager-5c6ddf959-tkc64" Jan 23 10:57:20 crc kubenswrapper[4957]: I0123 10:57:20.999902 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c6ddf959-tkc64" Jan 23 10:57:21 crc kubenswrapper[4957]: I0123 10:57:21.005135 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5c857f95bd-qfvdg" event={"ID":"e52d56b3-bdc9-4c7e-8ebb-35302e811647","Type":"ContainerDied","Data":"558678c105d84d7e483631c84f83334e85d3556c45f5d7a3e464d6ce8083f4bb"} Jan 23 10:57:21 crc kubenswrapper[4957]: I0123 10:57:21.005219 4957 scope.go:117] "RemoveContainer" containerID="2d41a0569bbae0bd2046bc3f8bb9659cbb94736d9a3abfc7663fd4ea082dd0ad" Jan 23 10:57:21 crc kubenswrapper[4957]: I0123 10:57:21.005195 4957 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c857f95bd-qfvdg" Jan 23 10:57:21 crc kubenswrapper[4957]: I0123 10:57:21.007126 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-b99f7d5cb-brfvx" event={"ID":"7b4feff5-e35c-4518-a9b8-4e39d86c8d56","Type":"ContainerDied","Data":"7a3c544d6b11d42aa9c67b12d2c82b3c699bf234d6da9544b4d56999e529053f"} Jan 23 10:57:21 crc kubenswrapper[4957]: I0123 10:57:21.007325 4957 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-b99f7d5cb-brfvx" Jan 23 10:57:21 crc kubenswrapper[4957]: I0123 10:57:21.018220 4957 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-74577df4c5-62dgw" Jan 23 10:57:21 crc kubenswrapper[4957]: I0123 10:57:21.035611 4957 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c857f95bd-qfvdg"] Jan 23 10:57:21 crc kubenswrapper[4957]: I0123 10:57:21.042374 4957 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c857f95bd-qfvdg"] Jan 23 10:57:21 crc kubenswrapper[4957]: I0123 10:57:21.051001 4957 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-b99f7d5cb-brfvx"] Jan 23 10:57:21 crc kubenswrapper[4957]: I0123 10:57:21.060974 4957 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-b99f7d5cb-brfvx"] Jan 23 10:57:21 crc kubenswrapper[4957]: I0123 10:57:21.071769 4957 scope.go:117] "RemoveContainer" containerID="761f9f5e11a7afbeda52e784bc01c67a21247c86138e4b5191845fb93aa9f29e" Jan 23 10:57:21 crc kubenswrapper[4957]: I0123 10:57:21.232473 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c6ddf959-tkc64"] Jan 23 10:57:21 crc kubenswrapper[4957]: W0123 10:57:21.237139 4957 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93e23c7f_b138_4218_ad40_a45ffe3c9bfa.slice/crio-39013b3b102c5b9e4123f83ddc04a3916ab765c93dc22137624ffa6846562e5c WatchSource:0}: Error finding container 39013b3b102c5b9e4123f83ddc04a3916ab765c93dc22137624ffa6846562e5c: Status 404 returned error can't find the container with id 39013b3b102c5b9e4123f83ddc04a3916ab765c93dc22137624ffa6846562e5c Jan 23 10:57:21 crc kubenswrapper[4957]: I0123 10:57:21.292033 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-74577df4c5-62dgw"] Jan 23 10:57:21 crc kubenswrapper[4957]: W0123 10:57:21.294768 4957 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1a1bf7e_168b_45a3_b91f_9b131130b358.slice/crio-8676d3cd6ff2b1789a289fbf22c0f648ee23e492e87928c5949d6bdaa75fc7fd WatchSource:0}: Error finding container 8676d3cd6ff2b1789a289fbf22c0f648ee23e492e87928c5949d6bdaa75fc7fd: Status 404 returned error can't find the container with id 8676d3cd6ff2b1789a289fbf22c0f648ee23e492e87928c5949d6bdaa75fc7fd Jan 23 10:57:22 crc kubenswrapper[4957]: I0123 10:57:22.013890 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-74577df4c5-62dgw" event={"ID":"b1a1bf7e-168b-45a3-b91f-9b131130b358","Type":"ContainerStarted","Data":"209516111bf4372577e31a780055ac94a8a81ecc441964d747a8c78d3cd7dea7"} Jan 23 10:57:22 crc kubenswrapper[4957]: I0123 10:57:22.014230 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-74577df4c5-62dgw" event={"ID":"b1a1bf7e-168b-45a3-b91f-9b131130b358","Type":"ContainerStarted","Data":"8676d3cd6ff2b1789a289fbf22c0f648ee23e492e87928c5949d6bdaa75fc7fd"} Jan 23 10:57:22 crc kubenswrapper[4957]: I0123 10:57:22.015646 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-74577df4c5-62dgw" Jan 23 10:57:22 crc kubenswrapper[4957]: I0123 10:57:22.016539 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-5c6ddf959-tkc64" event={"ID":"93e23c7f-b138-4218-ad40-a45ffe3c9bfa","Type":"ContainerStarted","Data":"9f37a30f466eb13a6f6521a6885edd8f67ef999ce8188bb9f620cbed44d62784"} Jan 23 10:57:22 crc kubenswrapper[4957]: I0123 10:57:22.016578 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5c6ddf959-tkc64" event={"ID":"93e23c7f-b138-4218-ad40-a45ffe3c9bfa","Type":"ContainerStarted","Data":"39013b3b102c5b9e4123f83ddc04a3916ab765c93dc22137624ffa6846562e5c"} Jan 23 10:57:22 crc kubenswrapper[4957]: I0123 10:57:22.016869 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5c6ddf959-tkc64" Jan 23 10:57:22 crc kubenswrapper[4957]: I0123 10:57:22.024567 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-74577df4c5-62dgw" Jan 23 10:57:22 crc kubenswrapper[4957]: I0123 10:57:22.043826 4957 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-74577df4c5-62dgw" podStartSLOduration=3.043804648 podStartE2EDuration="3.043804648s" podCreationTimestamp="2026-01-23 10:57:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-23 10:57:22.039138638 +0000 UTC m=+351.576391335" watchObservedRunningTime="2026-01-23 10:57:22.043804648 +0000 UTC m=+351.581057375" Jan 23 10:57:22 crc kubenswrapper[4957]: I0123 10:57:22.065051 4957 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5c6ddf959-tkc64" podStartSLOduration=3.065029612 podStartE2EDuration="3.065029612s" podCreationTimestamp="2026-01-23 10:57:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-23 10:57:22.057679842 +0000 UTC m=+351.594932549" watchObservedRunningTime="2026-01-23 10:57:22.065029612 +0000 UTC m=+351.602282299" Jan 23 10:57:22 crc kubenswrapper[4957]: I0123 10:57:22.208820 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5c6ddf959-tkc64" Jan 23 10:57:22 crc kubenswrapper[4957]: I0123 10:57:22.779546 4957 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b4feff5-e35c-4518-a9b8-4e39d86c8d56" path="/var/lib/kubelet/pods/7b4feff5-e35c-4518-a9b8-4e39d86c8d56/volumes" Jan 23 10:57:22 crc kubenswrapper[4957]: I0123 10:57:22.780081 4957 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e52d56b3-bdc9-4c7e-8ebb-35302e811647" path="/var/lib/kubelet/pods/e52d56b3-bdc9-4c7e-8ebb-35302e811647/volumes" Jan 23 10:57:45 crc kubenswrapper[4957]: I0123 10:57:45.717264 4957 patch_prober.go:28] interesting pod/machine-config-daemon-w2xjv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 23 10:57:45 crc kubenswrapper[4957]: I0123 10:57:45.718036 4957 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w2xjv" podUID="224e3211-1f68-4673-8975-7e71b1e513d0" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 23 10:57:59 crc kubenswrapper[4957]: I0123 10:57:59.455606 4957 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c6ddf959-tkc64"] Jan 23 10:57:59 crc kubenswrapper[4957]: I0123 10:57:59.456426 4957 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-5c6ddf959-tkc64" podUID="93e23c7f-b138-4218-ad40-a45ffe3c9bfa" containerName="route-controller-manager" containerID="cri-o://9f37a30f466eb13a6f6521a6885edd8f67ef999ce8188bb9f620cbed44d62784" gracePeriod=30 Jan 23 10:57:59 crc kubenswrapper[4957]: I0123 10:57:59.847844 4957 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c6ddf959-tkc64" Jan 23 10:58:00 crc kubenswrapper[4957]: I0123 10:58:00.028865 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-56m75\" (UniqueName: \"kubernetes.io/projected/93e23c7f-b138-4218-ad40-a45ffe3c9bfa-kube-api-access-56m75\") pod \"93e23c7f-b138-4218-ad40-a45ffe3c9bfa\" (UID: \"93e23c7f-b138-4218-ad40-a45ffe3c9bfa\") " Jan 23 10:58:00 crc kubenswrapper[4957]: I0123 10:58:00.028980 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93e23c7f-b138-4218-ad40-a45ffe3c9bfa-config\") pod \"93e23c7f-b138-4218-ad40-a45ffe3c9bfa\" (UID: \"93e23c7f-b138-4218-ad40-a45ffe3c9bfa\") " Jan 23 10:58:00 crc kubenswrapper[4957]: I0123 10:58:00.029012 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/93e23c7f-b138-4218-ad40-a45ffe3c9bfa-serving-cert\") pod \"93e23c7f-b138-4218-ad40-a45ffe3c9bfa\" (UID: \"93e23c7f-b138-4218-ad40-a45ffe3c9bfa\") " Jan 23 10:58:00 crc kubenswrapper[4957]: I0123 10:58:00.029032 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/93e23c7f-b138-4218-ad40-a45ffe3c9bfa-client-ca\") pod \"93e23c7f-b138-4218-ad40-a45ffe3c9bfa\" (UID: \"93e23c7f-b138-4218-ad40-a45ffe3c9bfa\") " Jan 23 10:58:00 crc kubenswrapper[4957]: I0123 10:58:00.029874 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93e23c7f-b138-4218-ad40-a45ffe3c9bfa-client-ca" (OuterVolumeSpecName: "client-ca") pod "93e23c7f-b138-4218-ad40-a45ffe3c9bfa" (UID: "93e23c7f-b138-4218-ad40-a45ffe3c9bfa"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 10:58:00 crc kubenswrapper[4957]: I0123 10:58:00.030699 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93e23c7f-b138-4218-ad40-a45ffe3c9bfa-config" (OuterVolumeSpecName: "config") pod "93e23c7f-b138-4218-ad40-a45ffe3c9bfa" (UID: "93e23c7f-b138-4218-ad40-a45ffe3c9bfa"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 10:58:00 crc kubenswrapper[4957]: I0123 10:58:00.036627 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93e23c7f-b138-4218-ad40-a45ffe3c9bfa-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "93e23c7f-b138-4218-ad40-a45ffe3c9bfa" (UID: "93e23c7f-b138-4218-ad40-a45ffe3c9bfa"). 
InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 23 10:58:00 crc kubenswrapper[4957]: I0123 10:58:00.036683 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93e23c7f-b138-4218-ad40-a45ffe3c9bfa-kube-api-access-56m75" (OuterVolumeSpecName: "kube-api-access-56m75") pod "93e23c7f-b138-4218-ad40-a45ffe3c9bfa" (UID: "93e23c7f-b138-4218-ad40-a45ffe3c9bfa"). InnerVolumeSpecName "kube-api-access-56m75". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 23 10:58:00 crc kubenswrapper[4957]: I0123 10:58:00.130478 4957 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93e23c7f-b138-4218-ad40-a45ffe3c9bfa-config\") on node \"crc\" DevicePath \"\"" Jan 23 10:58:00 crc kubenswrapper[4957]: I0123 10:58:00.130512 4957 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/93e23c7f-b138-4218-ad40-a45ffe3c9bfa-client-ca\") on node \"crc\" DevicePath \"\"" Jan 23 10:58:00 crc kubenswrapper[4957]: I0123 10:58:00.130523 4957 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/93e23c7f-b138-4218-ad40-a45ffe3c9bfa-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 23 10:58:00 crc kubenswrapper[4957]: I0123 10:58:00.130535 4957 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-56m75\" (UniqueName: \"kubernetes.io/projected/93e23c7f-b138-4218-ad40-a45ffe3c9bfa-kube-api-access-56m75\") on node \"crc\" DevicePath \"\"" Jan 23 10:58:00 crc kubenswrapper[4957]: I0123 10:58:00.235492 4957 generic.go:334] "Generic (PLEG): container finished" podID="93e23c7f-b138-4218-ad40-a45ffe3c9bfa" containerID="9f37a30f466eb13a6f6521a6885edd8f67ef999ce8188bb9f620cbed44d62784" exitCode=0 Jan 23 10:58:00 crc kubenswrapper[4957]: I0123 10:58:00.235539 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5c6ddf959-tkc64" event={"ID":"93e23c7f-b138-4218-ad40-a45ffe3c9bfa","Type":"ContainerDied","Data":"9f37a30f466eb13a6f6521a6885edd8f67ef999ce8188bb9f620cbed44d62784"} Jan 23 10:58:00 crc kubenswrapper[4957]: I0123 10:58:00.235565 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5c6ddf959-tkc64" event={"ID":"93e23c7f-b138-4218-ad40-a45ffe3c9bfa","Type":"ContainerDied","Data":"39013b3b102c5b9e4123f83ddc04a3916ab765c93dc22137624ffa6846562e5c"} Jan 23 10:58:00 crc kubenswrapper[4957]: I0123 10:58:00.235566 4957 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c6ddf959-tkc64" Jan 23 10:58:00 crc kubenswrapper[4957]: I0123 10:58:00.235583 4957 scope.go:117] "RemoveContainer" containerID="9f37a30f466eb13a6f6521a6885edd8f67ef999ce8188bb9f620cbed44d62784" Jan 23 10:58:00 crc kubenswrapper[4957]: I0123 10:58:00.257766 4957 scope.go:117] "RemoveContainer" containerID="9f37a30f466eb13a6f6521a6885edd8f67ef999ce8188bb9f620cbed44d62784" Jan 23 10:58:00 crc kubenswrapper[4957]: E0123 10:58:00.258340 4957 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f37a30f466eb13a6f6521a6885edd8f67ef999ce8188bb9f620cbed44d62784\": container with ID starting with 9f37a30f466eb13a6f6521a6885edd8f67ef999ce8188bb9f620cbed44d62784 not found: ID does not exist" containerID="9f37a30f466eb13a6f6521a6885edd8f67ef999ce8188bb9f620cbed44d62784" Jan 23 10:58:00 crc kubenswrapper[4957]: I0123 10:58:00.258393 4957 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f37a30f466eb13a6f6521a6885edd8f67ef999ce8188bb9f620cbed44d62784"} err="failed to get container status \"9f37a30f466eb13a6f6521a6885edd8f67ef999ce8188bb9f620cbed44d62784\": rpc error: code = NotFound desc = could not find container \"9f37a30f466eb13a6f6521a6885edd8f67ef999ce8188bb9f620cbed44d62784\": container with ID starting with 9f37a30f466eb13a6f6521a6885edd8f67ef999ce8188bb9f620cbed44d62784 not found: ID does not exist" Jan 23 10:58:00 crc kubenswrapper[4957]: I0123 10:58:00.273066 4957 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c6ddf959-tkc64"] Jan 23 10:58:00 crc kubenswrapper[4957]: I0123 10:58:00.278260 4957 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c6ddf959-tkc64"] Jan 23 10:58:00 crc kubenswrapper[4957]: I0123 10:58:00.696141 4957 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c857f95bd-snkrq"] Jan 23 10:58:00 crc kubenswrapper[4957]: E0123 10:58:00.696477 4957 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93e23c7f-b138-4218-ad40-a45ffe3c9bfa" containerName="route-controller-manager" Jan 23 10:58:00 crc kubenswrapper[4957]: I0123 10:58:00.696492 4957 state_mem.go:107] "Deleted CPUSet assignment" podUID="93e23c7f-b138-4218-ad40-a45ffe3c9bfa" containerName="route-controller-manager" Jan 23 10:58:00 crc kubenswrapper[4957]: I0123 10:58:00.696634 4957 memory_manager.go:354] "RemoveStaleState removing state" podUID="93e23c7f-b138-4218-ad40-a45ffe3c9bfa" containerName="route-controller-manager" Jan 23 10:58:00 crc kubenswrapper[4957]: I0123 10:58:00.697786 4957 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c857f95bd-snkrq" Jan 23 10:58:00 crc kubenswrapper[4957]: I0123 10:58:00.699838 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 23 10:58:00 crc kubenswrapper[4957]: I0123 10:58:00.700011 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 23 10:58:00 crc kubenswrapper[4957]: I0123 10:58:00.700009 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 23 10:58:00 crc kubenswrapper[4957]: I0123 10:58:00.701326 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 23 10:58:00 crc kubenswrapper[4957]: I0123 10:58:00.701383 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 23 10:58:00 crc kubenswrapper[4957]: I0123 10:58:00.703762 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 23 10:58:00 crc kubenswrapper[4957]: I0123 10:58:00.703943 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c857f95bd-snkrq"] Jan 23 10:58:00 crc kubenswrapper[4957]: I0123 10:58:00.735944 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04342eee-af09-4d7e-b386-66ef20696fcd-config\") pod \"route-controller-manager-5c857f95bd-snkrq\" (UID: \"04342eee-af09-4d7e-b386-66ef20696fcd\") " pod="openshift-route-controller-manager/route-controller-manager-5c857f95bd-snkrq" Jan 23 10:58:00 crc kubenswrapper[4957]: I0123 10:58:00.736000 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/04342eee-af09-4d7e-b386-66ef20696fcd-client-ca\") pod \"route-controller-manager-5c857f95bd-snkrq\" (UID: \"04342eee-af09-4d7e-b386-66ef20696fcd\") " pod="openshift-route-controller-manager/route-controller-manager-5c857f95bd-snkrq" Jan 23 10:58:00 crc kubenswrapper[4957]: I0123 10:58:00.736028 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bf9c7\" (UniqueName: \"kubernetes.io/projected/04342eee-af09-4d7e-b386-66ef20696fcd-kube-api-access-bf9c7\") pod \"route-controller-manager-5c857f95bd-snkrq\" (UID: \"04342eee-af09-4d7e-b386-66ef20696fcd\") " pod="openshift-route-controller-manager/route-controller-manager-5c857f95bd-snkrq" Jan 23 10:58:00 crc kubenswrapper[4957]: I0123 10:58:00.736085 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/04342eee-af09-4d7e-b386-66ef20696fcd-serving-cert\") pod \"route-controller-manager-5c857f95bd-snkrq\" (UID: \"04342eee-af09-4d7e-b386-66ef20696fcd\") " pod="openshift-route-controller-manager/route-controller-manager-5c857f95bd-snkrq" Jan 23 10:58:00 crc kubenswrapper[4957]: I0123 10:58:00.776452 4957 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93e23c7f-b138-4218-ad40-a45ffe3c9bfa" path="/var/lib/kubelet/pods/93e23c7f-b138-4218-ad40-a45ffe3c9bfa/volumes" Jan 23 10:58:00 crc 
kubenswrapper[4957]: I0123 10:58:00.836968 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/04342eee-af09-4d7e-b386-66ef20696fcd-serving-cert\") pod \"route-controller-manager-5c857f95bd-snkrq\" (UID: \"04342eee-af09-4d7e-b386-66ef20696fcd\") " pod="openshift-route-controller-manager/route-controller-manager-5c857f95bd-snkrq" Jan 23 10:58:00 crc kubenswrapper[4957]: I0123 10:58:00.837034 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04342eee-af09-4d7e-b386-66ef20696fcd-config\") pod \"route-controller-manager-5c857f95bd-snkrq\" (UID: \"04342eee-af09-4d7e-b386-66ef20696fcd\") " pod="openshift-route-controller-manager/route-controller-manager-5c857f95bd-snkrq" Jan 23 10:58:00 crc kubenswrapper[4957]: I0123 10:58:00.837085 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/04342eee-af09-4d7e-b386-66ef20696fcd-client-ca\") pod \"route-controller-manager-5c857f95bd-snkrq\" (UID: \"04342eee-af09-4d7e-b386-66ef20696fcd\") " pod="openshift-route-controller-manager/route-controller-manager-5c857f95bd-snkrq" Jan 23 10:58:00 crc kubenswrapper[4957]: I0123 10:58:00.837130 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bf9c7\" (UniqueName: \"kubernetes.io/projected/04342eee-af09-4d7e-b386-66ef20696fcd-kube-api-access-bf9c7\") pod \"route-controller-manager-5c857f95bd-snkrq\" (UID: \"04342eee-af09-4d7e-b386-66ef20696fcd\") " pod="openshift-route-controller-manager/route-controller-manager-5c857f95bd-snkrq" Jan 23 10:58:00 crc kubenswrapper[4957]: I0123 10:58:00.838570 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04342eee-af09-4d7e-b386-66ef20696fcd-config\") pod \"route-controller-manager-5c857f95bd-snkrq\" (UID: \"04342eee-af09-4d7e-b386-66ef20696fcd\") " pod="openshift-route-controller-manager/route-controller-manager-5c857f95bd-snkrq" Jan 23 10:58:00 crc kubenswrapper[4957]: I0123 10:58:00.839342 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/04342eee-af09-4d7e-b386-66ef20696fcd-client-ca\") pod \"route-controller-manager-5c857f95bd-snkrq\" (UID: \"04342eee-af09-4d7e-b386-66ef20696fcd\") " pod="openshift-route-controller-manager/route-controller-manager-5c857f95bd-snkrq" Jan 23 10:58:00 crc kubenswrapper[4957]: I0123 10:58:00.845240 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/04342eee-af09-4d7e-b386-66ef20696fcd-serving-cert\") pod \"route-controller-manager-5c857f95bd-snkrq\" (UID: \"04342eee-af09-4d7e-b386-66ef20696fcd\") " pod="openshift-route-controller-manager/route-controller-manager-5c857f95bd-snkrq" Jan 23 10:58:00 crc kubenswrapper[4957]: I0123 10:58:00.866958 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bf9c7\" (UniqueName: \"kubernetes.io/projected/04342eee-af09-4d7e-b386-66ef20696fcd-kube-api-access-bf9c7\") pod \"route-controller-manager-5c857f95bd-snkrq\" (UID: \"04342eee-af09-4d7e-b386-66ef20696fcd\") " pod="openshift-route-controller-manager/route-controller-manager-5c857f95bd-snkrq" Jan 23 10:58:01 crc kubenswrapper[4957]: I0123 10:58:01.022047 4957 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c857f95bd-snkrq" Jan 23 10:58:01 crc kubenswrapper[4957]: I0123 10:58:01.419936 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c857f95bd-snkrq"] Jan 23 10:58:02 crc kubenswrapper[4957]: I0123 10:58:02.249611 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5c857f95bd-snkrq" event={"ID":"04342eee-af09-4d7e-b386-66ef20696fcd","Type":"ContainerStarted","Data":"8f7fbadeed4d6ef3ba937f67b24c40022224637a735d4468418de53d031fdb15"} Jan 23 10:58:02 crc kubenswrapper[4957]: I0123 10:58:02.249686 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5c857f95bd-snkrq" event={"ID":"04342eee-af09-4d7e-b386-66ef20696fcd","Type":"ContainerStarted","Data":"1d24746f4409e3dd851b9f5cb1400bcaf64f1841cf25797a87efe1232f12843e"} Jan 23 10:58:02 crc kubenswrapper[4957]: I0123 10:58:02.249907 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5c857f95bd-snkrq" Jan 23 10:58:02 crc kubenswrapper[4957]: I0123 10:58:02.274038 4957 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5c857f95bd-snkrq" podStartSLOduration=3.274021501 podStartE2EDuration="3.274021501s" podCreationTimestamp="2026-01-23 10:57:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-23 10:58:02.27167641 +0000 UTC m=+391.808929157" watchObservedRunningTime="2026-01-23 10:58:02.274021501 +0000 UTC m=+391.811274188" Jan 23 10:58:02 crc kubenswrapper[4957]: I0123 10:58:02.355664 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5c857f95bd-snkrq" Jan 23 10:58:15 crc kubenswrapper[4957]: I0123 10:58:15.717729 4957 patch_prober.go:28] interesting pod/machine-config-daemon-w2xjv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 23 10:58:15 crc kubenswrapper[4957]: I0123 10:58:15.718468 4957 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w2xjv" podUID="224e3211-1f68-4673-8975-7e71b1e513d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 23 10:58:19 crc kubenswrapper[4957]: I0123 10:58:19.451413 4957 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-74577df4c5-62dgw"] Jan 23 10:58:19 crc kubenswrapper[4957]: I0123 10:58:19.451949 4957 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-74577df4c5-62dgw" podUID="b1a1bf7e-168b-45a3-b91f-9b131130b358" containerName="controller-manager" containerID="cri-o://209516111bf4372577e31a780055ac94a8a81ecc441964d747a8c78d3cd7dea7" gracePeriod=30 Jan 23 10:58:19 crc kubenswrapper[4957]: I0123 10:58:19.878094 4957 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-74577df4c5-62dgw" Jan 23 10:58:19 crc kubenswrapper[4957]: I0123 10:58:19.981728 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b1a1bf7e-168b-45a3-b91f-9b131130b358-proxy-ca-bundles\") pod \"b1a1bf7e-168b-45a3-b91f-9b131130b358\" (UID: \"b1a1bf7e-168b-45a3-b91f-9b131130b358\") " Jan 23 10:58:19 crc kubenswrapper[4957]: I0123 10:58:19.981830 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b1a1bf7e-168b-45a3-b91f-9b131130b358-serving-cert\") pod \"b1a1bf7e-168b-45a3-b91f-9b131130b358\" (UID: \"b1a1bf7e-168b-45a3-b91f-9b131130b358\") " Jan 23 10:58:19 crc kubenswrapper[4957]: I0123 10:58:19.981878 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jk7wf\" (UniqueName: \"kubernetes.io/projected/b1a1bf7e-168b-45a3-b91f-9b131130b358-kube-api-access-jk7wf\") pod \"b1a1bf7e-168b-45a3-b91f-9b131130b358\" (UID: \"b1a1bf7e-168b-45a3-b91f-9b131130b358\") " Jan 23 10:58:19 crc kubenswrapper[4957]: I0123 10:58:19.981925 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b1a1bf7e-168b-45a3-b91f-9b131130b358-client-ca\") pod \"b1a1bf7e-168b-45a3-b91f-9b131130b358\" (UID: \"b1a1bf7e-168b-45a3-b91f-9b131130b358\") " Jan 23 10:58:19 crc kubenswrapper[4957]: I0123 10:58:19.981957 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1a1bf7e-168b-45a3-b91f-9b131130b358-config\") pod \"b1a1bf7e-168b-45a3-b91f-9b131130b358\" (UID: \"b1a1bf7e-168b-45a3-b91f-9b131130b358\") " Jan 23 10:58:19 crc kubenswrapper[4957]: I0123 10:58:19.982789 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1a1bf7e-168b-45a3-b91f-9b131130b358-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "b1a1bf7e-168b-45a3-b91f-9b131130b358" (UID: "b1a1bf7e-168b-45a3-b91f-9b131130b358"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 10:58:19 crc kubenswrapper[4957]: I0123 10:58:19.982796 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1a1bf7e-168b-45a3-b91f-9b131130b358-client-ca" (OuterVolumeSpecName: "client-ca") pod "b1a1bf7e-168b-45a3-b91f-9b131130b358" (UID: "b1a1bf7e-168b-45a3-b91f-9b131130b358"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 10:58:19 crc kubenswrapper[4957]: I0123 10:58:19.982955 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1a1bf7e-168b-45a3-b91f-9b131130b358-config" (OuterVolumeSpecName: "config") pod "b1a1bf7e-168b-45a3-b91f-9b131130b358" (UID: "b1a1bf7e-168b-45a3-b91f-9b131130b358"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 10:58:19 crc kubenswrapper[4957]: I0123 10:58:19.988076 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1a1bf7e-168b-45a3-b91f-9b131130b358-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "b1a1bf7e-168b-45a3-b91f-9b131130b358" (UID: "b1a1bf7e-168b-45a3-b91f-9b131130b358"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 23 10:58:19 crc kubenswrapper[4957]: I0123 10:58:19.991879 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1a1bf7e-168b-45a3-b91f-9b131130b358-kube-api-access-jk7wf" (OuterVolumeSpecName: "kube-api-access-jk7wf") pod "b1a1bf7e-168b-45a3-b91f-9b131130b358" (UID: "b1a1bf7e-168b-45a3-b91f-9b131130b358"). InnerVolumeSpecName "kube-api-access-jk7wf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 23 10:58:20 crc kubenswrapper[4957]: I0123 10:58:20.083868 4957 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b1a1bf7e-168b-45a3-b91f-9b131130b358-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 23 10:58:20 crc kubenswrapper[4957]: I0123 10:58:20.083918 4957 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jk7wf\" (UniqueName: \"kubernetes.io/projected/b1a1bf7e-168b-45a3-b91f-9b131130b358-kube-api-access-jk7wf\") on node \"crc\" DevicePath \"\"" Jan 23 10:58:20 crc kubenswrapper[4957]: I0123 10:58:20.083937 4957 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b1a1bf7e-168b-45a3-b91f-9b131130b358-client-ca\") on node \"crc\" DevicePath \"\"" Jan 23 10:58:20 crc kubenswrapper[4957]: I0123 10:58:20.083952 4957 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1a1bf7e-168b-45a3-b91f-9b131130b358-config\") on node \"crc\" DevicePath \"\"" Jan 23 10:58:20 crc kubenswrapper[4957]: I0123 10:58:20.083968 4957 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b1a1bf7e-168b-45a3-b91f-9b131130b358-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 23 10:58:20 crc kubenswrapper[4957]: I0123 10:58:20.358563 4957 generic.go:334] "Generic (PLEG): container finished" podID="b1a1bf7e-168b-45a3-b91f-9b131130b358" containerID="209516111bf4372577e31a780055ac94a8a81ecc441964d747a8c78d3cd7dea7" exitCode=0 Jan 23 10:58:20 crc kubenswrapper[4957]: I0123 10:58:20.358597 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-74577df4c5-62dgw" event={"ID":"b1a1bf7e-168b-45a3-b91f-9b131130b358","Type":"ContainerDied","Data":"209516111bf4372577e31a780055ac94a8a81ecc441964d747a8c78d3cd7dea7"} Jan 23 10:58:20 crc kubenswrapper[4957]: I0123 10:58:20.358628 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-74577df4c5-62dgw" event={"ID":"b1a1bf7e-168b-45a3-b91f-9b131130b358","Type":"ContainerDied","Data":"8676d3cd6ff2b1789a289fbf22c0f648ee23e492e87928c5949d6bdaa75fc7fd"} Jan 23 10:58:20 crc kubenswrapper[4957]: I0123 10:58:20.358624 4957 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-74577df4c5-62dgw" Jan 23 10:58:20 crc kubenswrapper[4957]: I0123 10:58:20.358644 4957 scope.go:117] "RemoveContainer" containerID="209516111bf4372577e31a780055ac94a8a81ecc441964d747a8c78d3cd7dea7" Jan 23 10:58:20 crc kubenswrapper[4957]: I0123 10:58:20.371505 4957 scope.go:117] "RemoveContainer" containerID="209516111bf4372577e31a780055ac94a8a81ecc441964d747a8c78d3cd7dea7" Jan 23 10:58:20 crc kubenswrapper[4957]: E0123 10:58:20.371963 4957 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"209516111bf4372577e31a780055ac94a8a81ecc441964d747a8c78d3cd7dea7\": container with ID starting with 209516111bf4372577e31a780055ac94a8a81ecc441964d747a8c78d3cd7dea7 not found: ID does not exist" containerID="209516111bf4372577e31a780055ac94a8a81ecc441964d747a8c78d3cd7dea7" Jan 23 10:58:20 crc kubenswrapper[4957]: I0123 10:58:20.372020 4957 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"209516111bf4372577e31a780055ac94a8a81ecc441964d747a8c78d3cd7dea7"} err="failed to get container status \"209516111bf4372577e31a780055ac94a8a81ecc441964d747a8c78d3cd7dea7\": rpc error: code = NotFound desc = could not find container \"209516111bf4372577e31a780055ac94a8a81ecc441964d747a8c78d3cd7dea7\": container with ID starting with 209516111bf4372577e31a780055ac94a8a81ecc441964d747a8c78d3cd7dea7 not found: ID does not exist" Jan 23 10:58:20 crc kubenswrapper[4957]: I0123 10:58:20.390302 4957 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-74577df4c5-62dgw"] Jan 23 10:58:20 crc kubenswrapper[4957]: I0123 10:58:20.393686 4957 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-74577df4c5-62dgw"] Jan 23 10:58:20 crc kubenswrapper[4957]: I0123 10:58:20.714475 4957 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-b99f7d5cb-zrl45"] Jan 23 10:58:20 crc kubenswrapper[4957]: E0123 10:58:20.715794 4957 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1a1bf7e-168b-45a3-b91f-9b131130b358" containerName="controller-manager" Jan 23 10:58:20 crc kubenswrapper[4957]: I0123 10:58:20.715845 4957 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1a1bf7e-168b-45a3-b91f-9b131130b358" containerName="controller-manager" Jan 23 10:58:20 crc kubenswrapper[4957]: I0123 10:58:20.716347 4957 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1a1bf7e-168b-45a3-b91f-9b131130b358" containerName="controller-manager" Jan 23 10:58:20 crc kubenswrapper[4957]: I0123 10:58:20.717066 4957 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-b99f7d5cb-zrl45" Jan 23 10:58:20 crc kubenswrapper[4957]: I0123 10:58:20.720297 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 23 10:58:20 crc kubenswrapper[4957]: I0123 10:58:20.720575 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 23 10:58:20 crc kubenswrapper[4957]: I0123 10:58:20.720803 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 23 10:58:20 crc kubenswrapper[4957]: I0123 10:58:20.720827 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 23 10:58:20 crc kubenswrapper[4957]: I0123 10:58:20.720826 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 23 10:58:20 crc kubenswrapper[4957]: I0123 10:58:20.721332 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 23 10:58:20 crc kubenswrapper[4957]: I0123 10:58:20.733068 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-b99f7d5cb-zrl45"] Jan 23 10:58:20 crc kubenswrapper[4957]: I0123 10:58:20.734041 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 23 10:58:20 crc kubenswrapper[4957]: I0123 10:58:20.797453 4957 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1a1bf7e-168b-45a3-b91f-9b131130b358" path="/var/lib/kubelet/pods/b1a1bf7e-168b-45a3-b91f-9b131130b358/volumes" Jan 23 10:58:20 crc kubenswrapper[4957]: I0123 10:58:20.892425 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qb5m\" (UniqueName: \"kubernetes.io/projected/60c3cc87-0660-4795-90b5-cc474f0c175a-kube-api-access-6qb5m\") pod \"controller-manager-b99f7d5cb-zrl45\" (UID: \"60c3cc87-0660-4795-90b5-cc474f0c175a\") " pod="openshift-controller-manager/controller-manager-b99f7d5cb-zrl45" Jan 23 10:58:20 crc kubenswrapper[4957]: I0123 10:58:20.892716 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/60c3cc87-0660-4795-90b5-cc474f0c175a-serving-cert\") pod \"controller-manager-b99f7d5cb-zrl45\" (UID: \"60c3cc87-0660-4795-90b5-cc474f0c175a\") " pod="openshift-controller-manager/controller-manager-b99f7d5cb-zrl45" Jan 23 10:58:20 crc kubenswrapper[4957]: I0123 10:58:20.892791 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/60c3cc87-0660-4795-90b5-cc474f0c175a-proxy-ca-bundles\") pod \"controller-manager-b99f7d5cb-zrl45\" (UID: \"60c3cc87-0660-4795-90b5-cc474f0c175a\") " pod="openshift-controller-manager/controller-manager-b99f7d5cb-zrl45" Jan 23 10:58:20 crc kubenswrapper[4957]: I0123 10:58:20.892861 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60c3cc87-0660-4795-90b5-cc474f0c175a-config\") pod \"controller-manager-b99f7d5cb-zrl45\" (UID: \"60c3cc87-0660-4795-90b5-cc474f0c175a\") " 
pod="openshift-controller-manager/controller-manager-b99f7d5cb-zrl45" Jan 23 10:58:20 crc kubenswrapper[4957]: I0123 10:58:20.892899 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/60c3cc87-0660-4795-90b5-cc474f0c175a-client-ca\") pod \"controller-manager-b99f7d5cb-zrl45\" (UID: \"60c3cc87-0660-4795-90b5-cc474f0c175a\") " pod="openshift-controller-manager/controller-manager-b99f7d5cb-zrl45" Jan 23 10:58:20 crc kubenswrapper[4957]: I0123 10:58:20.994652 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/60c3cc87-0660-4795-90b5-cc474f0c175a-proxy-ca-bundles\") pod \"controller-manager-b99f7d5cb-zrl45\" (UID: \"60c3cc87-0660-4795-90b5-cc474f0c175a\") " pod="openshift-controller-manager/controller-manager-b99f7d5cb-zrl45" Jan 23 10:58:20 crc kubenswrapper[4957]: I0123 10:58:20.995058 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60c3cc87-0660-4795-90b5-cc474f0c175a-config\") pod \"controller-manager-b99f7d5cb-zrl45\" (UID: \"60c3cc87-0660-4795-90b5-cc474f0c175a\") " pod="openshift-controller-manager/controller-manager-b99f7d5cb-zrl45" Jan 23 10:58:20 crc kubenswrapper[4957]: I0123 10:58:20.995086 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/60c3cc87-0660-4795-90b5-cc474f0c175a-client-ca\") pod \"controller-manager-b99f7d5cb-zrl45\" (UID: \"60c3cc87-0660-4795-90b5-cc474f0c175a\") " pod="openshift-controller-manager/controller-manager-b99f7d5cb-zrl45" Jan 23 10:58:20 crc kubenswrapper[4957]: I0123 10:58:20.995124 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qb5m\" (UniqueName: \"kubernetes.io/projected/60c3cc87-0660-4795-90b5-cc474f0c175a-kube-api-access-6qb5m\") pod \"controller-manager-b99f7d5cb-zrl45\" (UID: \"60c3cc87-0660-4795-90b5-cc474f0c175a\") " pod="openshift-controller-manager/controller-manager-b99f7d5cb-zrl45" Jan 23 10:58:20 crc kubenswrapper[4957]: I0123 10:58:20.995192 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/60c3cc87-0660-4795-90b5-cc474f0c175a-serving-cert\") pod \"controller-manager-b99f7d5cb-zrl45\" (UID: \"60c3cc87-0660-4795-90b5-cc474f0c175a\") " pod="openshift-controller-manager/controller-manager-b99f7d5cb-zrl45" Jan 23 10:58:20 crc kubenswrapper[4957]: I0123 10:58:20.996100 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/60c3cc87-0660-4795-90b5-cc474f0c175a-proxy-ca-bundles\") pod \"controller-manager-b99f7d5cb-zrl45\" (UID: \"60c3cc87-0660-4795-90b5-cc474f0c175a\") " pod="openshift-controller-manager/controller-manager-b99f7d5cb-zrl45" Jan 23 10:58:20 crc kubenswrapper[4957]: I0123 10:58:20.996117 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/60c3cc87-0660-4795-90b5-cc474f0c175a-client-ca\") pod \"controller-manager-b99f7d5cb-zrl45\" (UID: \"60c3cc87-0660-4795-90b5-cc474f0c175a\") " pod="openshift-controller-manager/controller-manager-b99f7d5cb-zrl45" Jan 23 10:58:20 crc kubenswrapper[4957]: I0123 10:58:20.998188 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/60c3cc87-0660-4795-90b5-cc474f0c175a-config\") pod \"controller-manager-b99f7d5cb-zrl45\" (UID: \"60c3cc87-0660-4795-90b5-cc474f0c175a\") " pod="openshift-controller-manager/controller-manager-b99f7d5cb-zrl45" Jan 23 10:58:21 crc kubenswrapper[4957]: I0123 10:58:21.001488 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/60c3cc87-0660-4795-90b5-cc474f0c175a-serving-cert\") pod \"controller-manager-b99f7d5cb-zrl45\" (UID: \"60c3cc87-0660-4795-90b5-cc474f0c175a\") " pod="openshift-controller-manager/controller-manager-b99f7d5cb-zrl45" Jan 23 10:58:21 crc kubenswrapper[4957]: I0123 10:58:21.016787 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qb5m\" (UniqueName: \"kubernetes.io/projected/60c3cc87-0660-4795-90b5-cc474f0c175a-kube-api-access-6qb5m\") pod \"controller-manager-b99f7d5cb-zrl45\" (UID: \"60c3cc87-0660-4795-90b5-cc474f0c175a\") " pod="openshift-controller-manager/controller-manager-b99f7d5cb-zrl45" Jan 23 10:58:21 crc kubenswrapper[4957]: I0123 10:58:21.091808 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-b99f7d5cb-zrl45" Jan 23 10:58:21 crc kubenswrapper[4957]: I0123 10:58:21.310416 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-b99f7d5cb-zrl45"] Jan 23 10:58:21 crc kubenswrapper[4957]: I0123 10:58:21.367479 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-b99f7d5cb-zrl45" event={"ID":"60c3cc87-0660-4795-90b5-cc474f0c175a","Type":"ContainerStarted","Data":"3310ff0a42d577dbdf288483162b8bb9fc3df83d7459fdfc32d480f8d19beb18"} Jan 23 10:58:22 crc kubenswrapper[4957]: I0123 10:58:22.373856 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-b99f7d5cb-zrl45" event={"ID":"60c3cc87-0660-4795-90b5-cc474f0c175a","Type":"ContainerStarted","Data":"634bf3d29dd0443908d9c47f6f5a334601294fb412814470adacb88a473ec553"} Jan 23 10:58:22 crc kubenswrapper[4957]: I0123 10:58:22.374215 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-b99f7d5cb-zrl45" Jan 23 10:58:22 crc kubenswrapper[4957]: I0123 10:58:22.378948 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-b99f7d5cb-zrl45" Jan 23 10:58:22 crc kubenswrapper[4957]: I0123 10:58:22.392366 4957 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-b99f7d5cb-zrl45" podStartSLOduration=3.39234803 podStartE2EDuration="3.39234803s" podCreationTimestamp="2026-01-23 10:58:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-23 10:58:22.390752451 +0000 UTC m=+411.928005138" watchObservedRunningTime="2026-01-23 10:58:22.39234803 +0000 UTC m=+411.929600717" Jan 23 10:58:30 crc kubenswrapper[4957]: I0123 10:58:30.991001 4957 scope.go:117] "RemoveContainer" containerID="4b6915f908509c8609290327ffc2dccf0e5680dc227979285a7ebaca4643cb7a" Jan 23 10:58:31 crc kubenswrapper[4957]: I0123 10:58:31.014759 4957 scope.go:117] "RemoveContainer" containerID="da77583099215643577c5d064d67ce2cca9d0b74e7ba7c88f3a948a8516fd66c" Jan 23 10:58:31 crc 
kubenswrapper[4957]: I0123 10:58:31.035486 4957 scope.go:117] "RemoveContainer" containerID="dd61e9841e88c0a34764491cfafe3e4f94233e4ac0543a3e6ed1326dd8d7280d" Jan 23 10:58:31 crc kubenswrapper[4957]: I0123 10:58:31.052150 4957 scope.go:117] "RemoveContainer" containerID="8cdf71b1a8491d3a4853fde19a5b1af1eb4697cbf07de482e22a52704ba0470f" Jan 23 10:58:31 crc kubenswrapper[4957]: I0123 10:58:31.074221 4957 scope.go:117] "RemoveContainer" containerID="52eea4c3c7c3b8898e64dd0eb05c1883ea1c2fa94e7e606f3ab48bbf5aaee8d3" Jan 23 10:58:31 crc kubenswrapper[4957]: I0123 10:58:31.091155 4957 scope.go:117] "RemoveContainer" containerID="9f405b6b517d30a201b793965bd82536f496d62b89562cefc7e3a9d9f7829633" Jan 23 10:58:45 crc kubenswrapper[4957]: I0123 10:58:45.717348 4957 patch_prober.go:28] interesting pod/machine-config-daemon-w2xjv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 23 10:58:45 crc kubenswrapper[4957]: I0123 10:58:45.718205 4957 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w2xjv" podUID="224e3211-1f68-4673-8975-7e71b1e513d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 23 10:58:45 crc kubenswrapper[4957]: I0123 10:58:45.718312 4957 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w2xjv" Jan 23 10:58:45 crc kubenswrapper[4957]: I0123 10:58:45.719125 4957 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"380c1560a9b50de2450e83ba786578aa9f30c79645bd086692c358cef7dcbaf6"} pod="openshift-machine-config-operator/machine-config-daemon-w2xjv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 23 10:58:45 crc kubenswrapper[4957]: I0123 10:58:45.719203 4957 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w2xjv" podUID="224e3211-1f68-4673-8975-7e71b1e513d0" containerName="machine-config-daemon" containerID="cri-o://380c1560a9b50de2450e83ba786578aa9f30c79645bd086692c358cef7dcbaf6" gracePeriod=600 Jan 23 10:58:46 crc kubenswrapper[4957]: I0123 10:58:46.523338 4957 generic.go:334] "Generic (PLEG): container finished" podID="224e3211-1f68-4673-8975-7e71b1e513d0" containerID="380c1560a9b50de2450e83ba786578aa9f30c79645bd086692c358cef7dcbaf6" exitCode=0 Jan 23 10:58:46 crc kubenswrapper[4957]: I0123 10:58:46.523444 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2xjv" event={"ID":"224e3211-1f68-4673-8975-7e71b1e513d0","Type":"ContainerDied","Data":"380c1560a9b50de2450e83ba786578aa9f30c79645bd086692c358cef7dcbaf6"} Jan 23 10:58:46 crc kubenswrapper[4957]: I0123 10:58:46.523646 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2xjv" event={"ID":"224e3211-1f68-4673-8975-7e71b1e513d0","Type":"ContainerStarted","Data":"303ca83532cca99052083264393534af7ab68b89a646ac04970e5eb8fdb50844"} Jan 23 10:58:46 crc kubenswrapper[4957]: I0123 10:58:46.523672 4957 scope.go:117] "RemoveContainer" 
containerID="f355e8990ff693448c7b8df392b7b2caeb59d6fee6cf8d5d4200f8ce1b5e03ae" Jan 23 11:00:00 crc kubenswrapper[4957]: I0123 11:00:00.187547 4957 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29486100-bvdhs"] Jan 23 11:00:00 crc kubenswrapper[4957]: I0123 11:00:00.189251 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29486100-bvdhs" Jan 23 11:00:00 crc kubenswrapper[4957]: I0123 11:00:00.191190 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 23 11:00:00 crc kubenswrapper[4957]: I0123 11:00:00.191618 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 23 11:00:00 crc kubenswrapper[4957]: I0123 11:00:00.199762 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29486100-bvdhs"] Jan 23 11:00:00 crc kubenswrapper[4957]: I0123 11:00:00.310993 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4b397d51-7cfa-4ba1-9e00-3f9f7a1a7c46-config-volume\") pod \"collect-profiles-29486100-bvdhs\" (UID: \"4b397d51-7cfa-4ba1-9e00-3f9f7a1a7c46\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29486100-bvdhs" Jan 23 11:00:00 crc kubenswrapper[4957]: I0123 11:00:00.311092 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4b397d51-7cfa-4ba1-9e00-3f9f7a1a7c46-secret-volume\") pod \"collect-profiles-29486100-bvdhs\" (UID: \"4b397d51-7cfa-4ba1-9e00-3f9f7a1a7c46\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29486100-bvdhs" Jan 23 11:00:00 crc kubenswrapper[4957]: I0123 11:00:00.311127 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzl7p\" (UniqueName: \"kubernetes.io/projected/4b397d51-7cfa-4ba1-9e00-3f9f7a1a7c46-kube-api-access-zzl7p\") pod \"collect-profiles-29486100-bvdhs\" (UID: \"4b397d51-7cfa-4ba1-9e00-3f9f7a1a7c46\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29486100-bvdhs" Jan 23 11:00:00 crc kubenswrapper[4957]: I0123 11:00:00.412484 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4b397d51-7cfa-4ba1-9e00-3f9f7a1a7c46-secret-volume\") pod \"collect-profiles-29486100-bvdhs\" (UID: \"4b397d51-7cfa-4ba1-9e00-3f9f7a1a7c46\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29486100-bvdhs" Jan 23 11:00:00 crc kubenswrapper[4957]: I0123 11:00:00.412600 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzl7p\" (UniqueName: \"kubernetes.io/projected/4b397d51-7cfa-4ba1-9e00-3f9f7a1a7c46-kube-api-access-zzl7p\") pod \"collect-profiles-29486100-bvdhs\" (UID: \"4b397d51-7cfa-4ba1-9e00-3f9f7a1a7c46\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29486100-bvdhs" Jan 23 11:00:00 crc kubenswrapper[4957]: I0123 11:00:00.412720 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4b397d51-7cfa-4ba1-9e00-3f9f7a1a7c46-config-volume\") pod 
\"collect-profiles-29486100-bvdhs\" (UID: \"4b397d51-7cfa-4ba1-9e00-3f9f7a1a7c46\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29486100-bvdhs" Jan 23 11:00:00 crc kubenswrapper[4957]: I0123 11:00:00.414609 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4b397d51-7cfa-4ba1-9e00-3f9f7a1a7c46-config-volume\") pod \"collect-profiles-29486100-bvdhs\" (UID: \"4b397d51-7cfa-4ba1-9e00-3f9f7a1a7c46\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29486100-bvdhs" Jan 23 11:00:00 crc kubenswrapper[4957]: I0123 11:00:00.418398 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4b397d51-7cfa-4ba1-9e00-3f9f7a1a7c46-secret-volume\") pod \"collect-profiles-29486100-bvdhs\" (UID: \"4b397d51-7cfa-4ba1-9e00-3f9f7a1a7c46\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29486100-bvdhs" Jan 23 11:00:00 crc kubenswrapper[4957]: I0123 11:00:00.430528 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzl7p\" (UniqueName: \"kubernetes.io/projected/4b397d51-7cfa-4ba1-9e00-3f9f7a1a7c46-kube-api-access-zzl7p\") pod \"collect-profiles-29486100-bvdhs\" (UID: \"4b397d51-7cfa-4ba1-9e00-3f9f7a1a7c46\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29486100-bvdhs" Jan 23 11:00:00 crc kubenswrapper[4957]: I0123 11:00:00.515637 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29486100-bvdhs" Jan 23 11:00:00 crc kubenswrapper[4957]: I0123 11:00:00.909569 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29486100-bvdhs"] Jan 23 11:00:00 crc kubenswrapper[4957]: I0123 11:00:00.993902 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29486100-bvdhs" event={"ID":"4b397d51-7cfa-4ba1-9e00-3f9f7a1a7c46","Type":"ContainerStarted","Data":"aa57206c240db81fe2c9e4938ba4ff20f81cfba804d8fc1733a3c8ec3c5af572"} Jan 23 11:00:02 crc kubenswrapper[4957]: I0123 11:00:02.001321 4957 generic.go:334] "Generic (PLEG): container finished" podID="4b397d51-7cfa-4ba1-9e00-3f9f7a1a7c46" containerID="52214c5f7ac4004090259940b2cbc0232506b4099623edd403858293d0ae26c2" exitCode=0 Jan 23 11:00:02 crc kubenswrapper[4957]: I0123 11:00:02.001424 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29486100-bvdhs" event={"ID":"4b397d51-7cfa-4ba1-9e00-3f9f7a1a7c46","Type":"ContainerDied","Data":"52214c5f7ac4004090259940b2cbc0232506b4099623edd403858293d0ae26c2"} Jan 23 11:00:03 crc kubenswrapper[4957]: I0123 11:00:03.205421 4957 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29486100-bvdhs" Jan 23 11:00:03 crc kubenswrapper[4957]: I0123 11:00:03.344824 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4b397d51-7cfa-4ba1-9e00-3f9f7a1a7c46-secret-volume\") pod \"4b397d51-7cfa-4ba1-9e00-3f9f7a1a7c46\" (UID: \"4b397d51-7cfa-4ba1-9e00-3f9f7a1a7c46\") " Jan 23 11:00:03 crc kubenswrapper[4957]: I0123 11:00:03.344864 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zzl7p\" (UniqueName: \"kubernetes.io/projected/4b397d51-7cfa-4ba1-9e00-3f9f7a1a7c46-kube-api-access-zzl7p\") pod \"4b397d51-7cfa-4ba1-9e00-3f9f7a1a7c46\" (UID: \"4b397d51-7cfa-4ba1-9e00-3f9f7a1a7c46\") " Jan 23 11:00:03 crc kubenswrapper[4957]: I0123 11:00:03.344946 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4b397d51-7cfa-4ba1-9e00-3f9f7a1a7c46-config-volume\") pod \"4b397d51-7cfa-4ba1-9e00-3f9f7a1a7c46\" (UID: \"4b397d51-7cfa-4ba1-9e00-3f9f7a1a7c46\") " Jan 23 11:00:03 crc kubenswrapper[4957]: I0123 11:00:03.345486 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b397d51-7cfa-4ba1-9e00-3f9f7a1a7c46-config-volume" (OuterVolumeSpecName: "config-volume") pod "4b397d51-7cfa-4ba1-9e00-3f9f7a1a7c46" (UID: "4b397d51-7cfa-4ba1-9e00-3f9f7a1a7c46"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 11:00:03 crc kubenswrapper[4957]: I0123 11:00:03.349225 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b397d51-7cfa-4ba1-9e00-3f9f7a1a7c46-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "4b397d51-7cfa-4ba1-9e00-3f9f7a1a7c46" (UID: "4b397d51-7cfa-4ba1-9e00-3f9f7a1a7c46"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 23 11:00:03 crc kubenswrapper[4957]: I0123 11:00:03.349508 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b397d51-7cfa-4ba1-9e00-3f9f7a1a7c46-kube-api-access-zzl7p" (OuterVolumeSpecName: "kube-api-access-zzl7p") pod "4b397d51-7cfa-4ba1-9e00-3f9f7a1a7c46" (UID: "4b397d51-7cfa-4ba1-9e00-3f9f7a1a7c46"). InnerVolumeSpecName "kube-api-access-zzl7p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 23 11:00:03 crc kubenswrapper[4957]: I0123 11:00:03.446402 4957 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4b397d51-7cfa-4ba1-9e00-3f9f7a1a7c46-config-volume\") on node \"crc\" DevicePath \"\"" Jan 23 11:00:03 crc kubenswrapper[4957]: I0123 11:00:03.446445 4957 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4b397d51-7cfa-4ba1-9e00-3f9f7a1a7c46-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 23 11:00:03 crc kubenswrapper[4957]: I0123 11:00:03.446457 4957 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zzl7p\" (UniqueName: \"kubernetes.io/projected/4b397d51-7cfa-4ba1-9e00-3f9f7a1a7c46-kube-api-access-zzl7p\") on node \"crc\" DevicePath \"\"" Jan 23 11:00:04 crc kubenswrapper[4957]: I0123 11:00:04.012294 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29486100-bvdhs" event={"ID":"4b397d51-7cfa-4ba1-9e00-3f9f7a1a7c46","Type":"ContainerDied","Data":"aa57206c240db81fe2c9e4938ba4ff20f81cfba804d8fc1733a3c8ec3c5af572"} Jan 23 11:00:04 crc kubenswrapper[4957]: I0123 11:00:04.012347 4957 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aa57206c240db81fe2c9e4938ba4ff20f81cfba804d8fc1733a3c8ec3c5af572" Jan 23 11:00:04 crc kubenswrapper[4957]: I0123 11:00:04.012366 4957 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29486100-bvdhs" Jan 23 11:00:18 crc kubenswrapper[4957]: I0123 11:00:18.503558 4957 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-z8hcw"] Jan 23 11:00:18 crc kubenswrapper[4957]: I0123 11:00:18.504960 4957 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" podUID="87adc28a-89e3-4743-a9f2-098d4a9432d8" containerName="ovn-controller" containerID="cri-o://26b4bdc4f2514902dc2c95df59af4c954a5c1905821f5981e9437ff54d6d544d" gracePeriod=30 Jan 23 11:00:18 crc kubenswrapper[4957]: I0123 11:00:18.505507 4957 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" podUID="87adc28a-89e3-4743-a9f2-098d4a9432d8" containerName="sbdb" containerID="cri-o://1be5b459e3fae28da165ef0ee506ec5ccd39026d7b7e7c35a3f242c65d60d0ad" gracePeriod=30 Jan 23 11:00:18 crc kubenswrapper[4957]: I0123 11:00:18.505580 4957 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" podUID="87adc28a-89e3-4743-a9f2-098d4a9432d8" containerName="nbdb" containerID="cri-o://7f862c6f11fe904458a8ecde92079c0b4aa4a9cb4dfc6f2ca094a1d3142570d4" gracePeriod=30 Jan 23 11:00:18 crc kubenswrapper[4957]: I0123 11:00:18.505659 4957 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" podUID="87adc28a-89e3-4743-a9f2-098d4a9432d8" containerName="northd" containerID="cri-o://0efa75cf10a812bc4de5b071048558eb5f48828f6fb3049f3820fe5e0b7e2b0b" gracePeriod=30 Jan 23 11:00:18 crc kubenswrapper[4957]: I0123 11:00:18.505748 4957 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" podUID="87adc28a-89e3-4743-a9f2-098d4a9432d8" containerName="kube-rbac-proxy-ovn-metrics" 
containerID="cri-o://0ca18567eec1b0cc34d911b28d9f3d670a061722086817f58236f6a0da557262" gracePeriod=30 Jan 23 11:00:18 crc kubenswrapper[4957]: I0123 11:00:18.505807 4957 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" podUID="87adc28a-89e3-4743-a9f2-098d4a9432d8" containerName="kube-rbac-proxy-node" containerID="cri-o://1a14cf2687aa7c7a4c43dffbc2ad99a41aef0e46719171f63c7f769ee2d54e4f" gracePeriod=30 Jan 23 11:00:18 crc kubenswrapper[4957]: I0123 11:00:18.505863 4957 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" podUID="87adc28a-89e3-4743-a9f2-098d4a9432d8" containerName="ovn-acl-logging" containerID="cri-o://d1a90ac89ce8ac710e5f8cff26e69aff44a735ef8155a7e93324809904a33e01" gracePeriod=30 Jan 23 11:00:18 crc kubenswrapper[4957]: I0123 11:00:18.550323 4957 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" podUID="87adc28a-89e3-4743-a9f2-098d4a9432d8" containerName="ovnkube-controller" containerID="cri-o://084ff950e9d3aef7d2a76aab2117ab6a65b6f845927f4cbf12fe8cbd8e56c3a3" gracePeriod=30 Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.119436 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-tlz2g_233fdd78-4010-4fe8-9068-ee47d8ff25d1/kube-multus/2.log" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.120000 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-tlz2g_233fdd78-4010-4fe8-9068-ee47d8ff25d1/kube-multus/1.log" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.120138 4957 generic.go:334] "Generic (PLEG): container finished" podID="233fdd78-4010-4fe8-9068-ee47d8ff25d1" containerID="8e5c8d9deccce35da00836243a4c325855cbf167f74a231795eea7aff84803a4" exitCode=2 Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.120234 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-tlz2g" event={"ID":"233fdd78-4010-4fe8-9068-ee47d8ff25d1","Type":"ContainerDied","Data":"8e5c8d9deccce35da00836243a4c325855cbf167f74a231795eea7aff84803a4"} Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.120320 4957 scope.go:117] "RemoveContainer" containerID="a41f7e81b1359b374160b43aed747c1058d4a086980d803825ae41e507f3d77c" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.120715 4957 scope.go:117] "RemoveContainer" containerID="8e5c8d9deccce35da00836243a4c325855cbf167f74a231795eea7aff84803a4" Jan 23 11:00:19 crc kubenswrapper[4957]: E0123 11:00:19.120888 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-tlz2g_openshift-multus(233fdd78-4010-4fe8-9068-ee47d8ff25d1)\"" pod="openshift-multus/multus-tlz2g" podUID="233fdd78-4010-4fe8-9068-ee47d8ff25d1" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.129566 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z8hcw_87adc28a-89e3-4743-a9f2-098d4a9432d8/ovnkube-controller/3.log" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.132378 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z8hcw_87adc28a-89e3-4743-a9f2-098d4a9432d8/ovn-acl-logging/0.log" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.133050 4957 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z8hcw_87adc28a-89e3-4743-a9f2-098d4a9432d8/ovn-controller/0.log" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.133847 4957 generic.go:334] "Generic (PLEG): container finished" podID="87adc28a-89e3-4743-a9f2-098d4a9432d8" containerID="084ff950e9d3aef7d2a76aab2117ab6a65b6f845927f4cbf12fe8cbd8e56c3a3" exitCode=0 Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.133888 4957 generic.go:334] "Generic (PLEG): container finished" podID="87adc28a-89e3-4743-a9f2-098d4a9432d8" containerID="0ca18567eec1b0cc34d911b28d9f3d670a061722086817f58236f6a0da557262" exitCode=0 Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.133899 4957 generic.go:334] "Generic (PLEG): container finished" podID="87adc28a-89e3-4743-a9f2-098d4a9432d8" containerID="1a14cf2687aa7c7a4c43dffbc2ad99a41aef0e46719171f63c7f769ee2d54e4f" exitCode=0 Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.133909 4957 generic.go:334] "Generic (PLEG): container finished" podID="87adc28a-89e3-4743-a9f2-098d4a9432d8" containerID="d1a90ac89ce8ac710e5f8cff26e69aff44a735ef8155a7e93324809904a33e01" exitCode=143 Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.133920 4957 generic.go:334] "Generic (PLEG): container finished" podID="87adc28a-89e3-4743-a9f2-098d4a9432d8" containerID="26b4bdc4f2514902dc2c95df59af4c954a5c1905821f5981e9437ff54d6d544d" exitCode=143 Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.133948 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" event={"ID":"87adc28a-89e3-4743-a9f2-098d4a9432d8","Type":"ContainerDied","Data":"084ff950e9d3aef7d2a76aab2117ab6a65b6f845927f4cbf12fe8cbd8e56c3a3"} Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.133985 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" event={"ID":"87adc28a-89e3-4743-a9f2-098d4a9432d8","Type":"ContainerDied","Data":"0ca18567eec1b0cc34d911b28d9f3d670a061722086817f58236f6a0da557262"} Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.134002 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" event={"ID":"87adc28a-89e3-4743-a9f2-098d4a9432d8","Type":"ContainerDied","Data":"1a14cf2687aa7c7a4c43dffbc2ad99a41aef0e46719171f63c7f769ee2d54e4f"} Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.134017 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" event={"ID":"87adc28a-89e3-4743-a9f2-098d4a9432d8","Type":"ContainerDied","Data":"d1a90ac89ce8ac710e5f8cff26e69aff44a735ef8155a7e93324809904a33e01"} Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.134055 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" event={"ID":"87adc28a-89e3-4743-a9f2-098d4a9432d8","Type":"ContainerDied","Data":"26b4bdc4f2514902dc2c95df59af4c954a5c1905821f5981e9437ff54d6d544d"} Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.172851 4957 scope.go:117] "RemoveContainer" containerID="6cbd0fe66fb090078f66ddc5174cf5273cbe2b54ca7beb8afcf6de97c848666e" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.288052 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z8hcw_87adc28a-89e3-4743-a9f2-098d4a9432d8/ovn-acl-logging/0.log" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.288539 4957 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z8hcw_87adc28a-89e3-4743-a9f2-098d4a9432d8/ovn-controller/0.log" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.289242 4957 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.349159 4957 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-wpvl5"] Jan 23 11:00:19 crc kubenswrapper[4957]: E0123 11:00:19.349407 4957 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b397d51-7cfa-4ba1-9e00-3f9f7a1a7c46" containerName="collect-profiles" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.349422 4957 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b397d51-7cfa-4ba1-9e00-3f9f7a1a7c46" containerName="collect-profiles" Jan 23 11:00:19 crc kubenswrapper[4957]: E0123 11:00:19.349435 4957 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87adc28a-89e3-4743-a9f2-098d4a9432d8" containerName="ovnkube-controller" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.349443 4957 state_mem.go:107] "Deleted CPUSet assignment" podUID="87adc28a-89e3-4743-a9f2-098d4a9432d8" containerName="ovnkube-controller" Jan 23 11:00:19 crc kubenswrapper[4957]: E0123 11:00:19.349452 4957 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87adc28a-89e3-4743-a9f2-098d4a9432d8" containerName="northd" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.349459 4957 state_mem.go:107] "Deleted CPUSet assignment" podUID="87adc28a-89e3-4743-a9f2-098d4a9432d8" containerName="northd" Jan 23 11:00:19 crc kubenswrapper[4957]: E0123 11:00:19.349470 4957 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87adc28a-89e3-4743-a9f2-098d4a9432d8" containerName="kube-rbac-proxy-ovn-metrics" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.349478 4957 state_mem.go:107] "Deleted CPUSet assignment" podUID="87adc28a-89e3-4743-a9f2-098d4a9432d8" containerName="kube-rbac-proxy-ovn-metrics" Jan 23 11:00:19 crc kubenswrapper[4957]: E0123 11:00:19.349492 4957 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87adc28a-89e3-4743-a9f2-098d4a9432d8" containerName="nbdb" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.349500 4957 state_mem.go:107] "Deleted CPUSet assignment" podUID="87adc28a-89e3-4743-a9f2-098d4a9432d8" containerName="nbdb" Jan 23 11:00:19 crc kubenswrapper[4957]: E0123 11:00:19.349511 4957 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87adc28a-89e3-4743-a9f2-098d4a9432d8" containerName="ovnkube-controller" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.349519 4957 state_mem.go:107] "Deleted CPUSet assignment" podUID="87adc28a-89e3-4743-a9f2-098d4a9432d8" containerName="ovnkube-controller" Jan 23 11:00:19 crc kubenswrapper[4957]: E0123 11:00:19.349529 4957 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87adc28a-89e3-4743-a9f2-098d4a9432d8" containerName="kube-rbac-proxy-node" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.349536 4957 state_mem.go:107] "Deleted CPUSet assignment" podUID="87adc28a-89e3-4743-a9f2-098d4a9432d8" containerName="kube-rbac-proxy-node" Jan 23 11:00:19 crc kubenswrapper[4957]: E0123 11:00:19.349545 4957 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87adc28a-89e3-4743-a9f2-098d4a9432d8" containerName="ovnkube-controller" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.349552 4957 
state_mem.go:107] "Deleted CPUSet assignment" podUID="87adc28a-89e3-4743-a9f2-098d4a9432d8" containerName="ovnkube-controller" Jan 23 11:00:19 crc kubenswrapper[4957]: E0123 11:00:19.349564 4957 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87adc28a-89e3-4743-a9f2-098d4a9432d8" containerName="ovnkube-controller" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.349571 4957 state_mem.go:107] "Deleted CPUSet assignment" podUID="87adc28a-89e3-4743-a9f2-098d4a9432d8" containerName="ovnkube-controller" Jan 23 11:00:19 crc kubenswrapper[4957]: E0123 11:00:19.349582 4957 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87adc28a-89e3-4743-a9f2-098d4a9432d8" containerName="kubecfg-setup" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.349589 4957 state_mem.go:107] "Deleted CPUSet assignment" podUID="87adc28a-89e3-4743-a9f2-098d4a9432d8" containerName="kubecfg-setup" Jan 23 11:00:19 crc kubenswrapper[4957]: E0123 11:00:19.349597 4957 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87adc28a-89e3-4743-a9f2-098d4a9432d8" containerName="ovn-controller" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.349605 4957 state_mem.go:107] "Deleted CPUSet assignment" podUID="87adc28a-89e3-4743-a9f2-098d4a9432d8" containerName="ovn-controller" Jan 23 11:00:19 crc kubenswrapper[4957]: E0123 11:00:19.349617 4957 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87adc28a-89e3-4743-a9f2-098d4a9432d8" containerName="sbdb" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.349624 4957 state_mem.go:107] "Deleted CPUSet assignment" podUID="87adc28a-89e3-4743-a9f2-098d4a9432d8" containerName="sbdb" Jan 23 11:00:19 crc kubenswrapper[4957]: E0123 11:00:19.349634 4957 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87adc28a-89e3-4743-a9f2-098d4a9432d8" containerName="ovn-acl-logging" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.349642 4957 state_mem.go:107] "Deleted CPUSet assignment" podUID="87adc28a-89e3-4743-a9f2-098d4a9432d8" containerName="ovn-acl-logging" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.349751 4957 memory_manager.go:354] "RemoveStaleState removing state" podUID="87adc28a-89e3-4743-a9f2-098d4a9432d8" containerName="kube-rbac-proxy-ovn-metrics" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.349768 4957 memory_manager.go:354] "RemoveStaleState removing state" podUID="87adc28a-89e3-4743-a9f2-098d4a9432d8" containerName="ovn-acl-logging" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.349778 4957 memory_manager.go:354] "RemoveStaleState removing state" podUID="87adc28a-89e3-4743-a9f2-098d4a9432d8" containerName="kube-rbac-proxy-node" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.349789 4957 memory_manager.go:354] "RemoveStaleState removing state" podUID="87adc28a-89e3-4743-a9f2-098d4a9432d8" containerName="ovnkube-controller" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.349797 4957 memory_manager.go:354] "RemoveStaleState removing state" podUID="87adc28a-89e3-4743-a9f2-098d4a9432d8" containerName="sbdb" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.349808 4957 memory_manager.go:354] "RemoveStaleState removing state" podUID="87adc28a-89e3-4743-a9f2-098d4a9432d8" containerName="northd" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.349816 4957 memory_manager.go:354] "RemoveStaleState removing state" podUID="87adc28a-89e3-4743-a9f2-098d4a9432d8" containerName="nbdb" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 
11:00:19.349827 4957 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b397d51-7cfa-4ba1-9e00-3f9f7a1a7c46" containerName="collect-profiles" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.349838 4957 memory_manager.go:354] "RemoveStaleState removing state" podUID="87adc28a-89e3-4743-a9f2-098d4a9432d8" containerName="ovnkube-controller" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.349849 4957 memory_manager.go:354] "RemoveStaleState removing state" podUID="87adc28a-89e3-4743-a9f2-098d4a9432d8" containerName="ovn-controller" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.349858 4957 memory_manager.go:354] "RemoveStaleState removing state" podUID="87adc28a-89e3-4743-a9f2-098d4a9432d8" containerName="ovnkube-controller" Jan 23 11:00:19 crc kubenswrapper[4957]: E0123 11:00:19.349966 4957 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87adc28a-89e3-4743-a9f2-098d4a9432d8" containerName="ovnkube-controller" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.349976 4957 state_mem.go:107] "Deleted CPUSet assignment" podUID="87adc28a-89e3-4743-a9f2-098d4a9432d8" containerName="ovnkube-controller" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.350087 4957 memory_manager.go:354] "RemoveStaleState removing state" podUID="87adc28a-89e3-4743-a9f2-098d4a9432d8" containerName="ovnkube-controller" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.350102 4957 memory_manager.go:354] "RemoveStaleState removing state" podUID="87adc28a-89e3-4743-a9f2-098d4a9432d8" containerName="ovnkube-controller" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.352134 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-wpvl5" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.459049 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/87adc28a-89e3-4743-a9f2-098d4a9432d8-systemd-units\") pod \"87adc28a-89e3-4743-a9f2-098d4a9432d8\" (UID: \"87adc28a-89e3-4743-a9f2-098d4a9432d8\") " Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.459113 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/87adc28a-89e3-4743-a9f2-098d4a9432d8-host-var-lib-cni-networks-ovn-kubernetes\") pod \"87adc28a-89e3-4743-a9f2-098d4a9432d8\" (UID: \"87adc28a-89e3-4743-a9f2-098d4a9432d8\") " Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.459144 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/87adc28a-89e3-4743-a9f2-098d4a9432d8-node-log\") pod \"87adc28a-89e3-4743-a9f2-098d4a9432d8\" (UID: \"87adc28a-89e3-4743-a9f2-098d4a9432d8\") " Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.459185 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/87adc28a-89e3-4743-a9f2-098d4a9432d8-ovnkube-script-lib\") pod \"87adc28a-89e3-4743-a9f2-098d4a9432d8\" (UID: \"87adc28a-89e3-4743-a9f2-098d4a9432d8\") " Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.459214 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/87adc28a-89e3-4743-a9f2-098d4a9432d8-ovnkube-config\") pod \"87adc28a-89e3-4743-a9f2-098d4a9432d8\" (UID: 
\"87adc28a-89e3-4743-a9f2-098d4a9432d8\") " Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.459249 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/87adc28a-89e3-4743-a9f2-098d4a9432d8-var-lib-openvswitch\") pod \"87adc28a-89e3-4743-a9f2-098d4a9432d8\" (UID: \"87adc28a-89e3-4743-a9f2-098d4a9432d8\") " Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.459296 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/87adc28a-89e3-4743-a9f2-098d4a9432d8-run-systemd\") pod \"87adc28a-89e3-4743-a9f2-098d4a9432d8\" (UID: \"87adc28a-89e3-4743-a9f2-098d4a9432d8\") " Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.459321 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/87adc28a-89e3-4743-a9f2-098d4a9432d8-env-overrides\") pod \"87adc28a-89e3-4743-a9f2-098d4a9432d8\" (UID: \"87adc28a-89e3-4743-a9f2-098d4a9432d8\") " Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.459344 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/87adc28a-89e3-4743-a9f2-098d4a9432d8-log-socket\") pod \"87adc28a-89e3-4743-a9f2-098d4a9432d8\" (UID: \"87adc28a-89e3-4743-a9f2-098d4a9432d8\") " Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.459367 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/87adc28a-89e3-4743-a9f2-098d4a9432d8-run-ovn\") pod \"87adc28a-89e3-4743-a9f2-098d4a9432d8\" (UID: \"87adc28a-89e3-4743-a9f2-098d4a9432d8\") " Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.459389 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/87adc28a-89e3-4743-a9f2-098d4a9432d8-run-openvswitch\") pod \"87adc28a-89e3-4743-a9f2-098d4a9432d8\" (UID: \"87adc28a-89e3-4743-a9f2-098d4a9432d8\") " Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.459450 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/87adc28a-89e3-4743-a9f2-098d4a9432d8-host-run-ovn-kubernetes\") pod \"87adc28a-89e3-4743-a9f2-098d4a9432d8\" (UID: \"87adc28a-89e3-4743-a9f2-098d4a9432d8\") " Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.459486 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/87adc28a-89e3-4743-a9f2-098d4a9432d8-ovn-node-metrics-cert\") pod \"87adc28a-89e3-4743-a9f2-098d4a9432d8\" (UID: \"87adc28a-89e3-4743-a9f2-098d4a9432d8\") " Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.459515 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/87adc28a-89e3-4743-a9f2-098d4a9432d8-etc-openvswitch\") pod \"87adc28a-89e3-4743-a9f2-098d4a9432d8\" (UID: \"87adc28a-89e3-4743-a9f2-098d4a9432d8\") " Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.459550 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/87adc28a-89e3-4743-a9f2-098d4a9432d8-host-cni-bin\") pod \"87adc28a-89e3-4743-a9f2-098d4a9432d8\" (UID: 
\"87adc28a-89e3-4743-a9f2-098d4a9432d8\") " Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.459593 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/87adc28a-89e3-4743-a9f2-098d4a9432d8-host-slash\") pod \"87adc28a-89e3-4743-a9f2-098d4a9432d8\" (UID: \"87adc28a-89e3-4743-a9f2-098d4a9432d8\") " Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.459632 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/87adc28a-89e3-4743-a9f2-098d4a9432d8-host-kubelet\") pod \"87adc28a-89e3-4743-a9f2-098d4a9432d8\" (UID: \"87adc28a-89e3-4743-a9f2-098d4a9432d8\") " Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.459675 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8nhgm\" (UniqueName: \"kubernetes.io/projected/87adc28a-89e3-4743-a9f2-098d4a9432d8-kube-api-access-8nhgm\") pod \"87adc28a-89e3-4743-a9f2-098d4a9432d8\" (UID: \"87adc28a-89e3-4743-a9f2-098d4a9432d8\") " Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.459700 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/87adc28a-89e3-4743-a9f2-098d4a9432d8-host-run-netns\") pod \"87adc28a-89e3-4743-a9f2-098d4a9432d8\" (UID: \"87adc28a-89e3-4743-a9f2-098d4a9432d8\") " Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.459724 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/87adc28a-89e3-4743-a9f2-098d4a9432d8-host-cni-netd\") pod \"87adc28a-89e3-4743-a9f2-098d4a9432d8\" (UID: \"87adc28a-89e3-4743-a9f2-098d4a9432d8\") " Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.459855 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7bdp\" (UniqueName: \"kubernetes.io/projected/87c71c69-ce2f-4e09-bfc0-3e0258676386-kube-api-access-z7bdp\") pod \"ovnkube-node-wpvl5\" (UID: \"87c71c69-ce2f-4e09-bfc0-3e0258676386\") " pod="openshift-ovn-kubernetes/ovnkube-node-wpvl5" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.459196 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/87adc28a-89e3-4743-a9f2-098d4a9432d8-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "87adc28a-89e3-4743-a9f2-098d4a9432d8" (UID: "87adc28a-89e3-4743-a9f2-098d4a9432d8"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.459886 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/87c71c69-ce2f-4e09-bfc0-3e0258676386-host-run-ovn-kubernetes\") pod \"ovnkube-node-wpvl5\" (UID: \"87c71c69-ce2f-4e09-bfc0-3e0258676386\") " pod="openshift-ovn-kubernetes/ovnkube-node-wpvl5" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.460112 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87adc28a-89e3-4743-a9f2-098d4a9432d8-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "87adc28a-89e3-4743-a9f2-098d4a9432d8" (UID: "87adc28a-89e3-4743-a9f2-098d4a9432d8"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.459244 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/87adc28a-89e3-4743-a9f2-098d4a9432d8-node-log" (OuterVolumeSpecName: "node-log") pod "87adc28a-89e3-4743-a9f2-098d4a9432d8" (UID: "87adc28a-89e3-4743-a9f2-098d4a9432d8"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.460128 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/87c71c69-ce2f-4e09-bfc0-3e0258676386-etc-openvswitch\") pod \"ovnkube-node-wpvl5\" (UID: \"87c71c69-ce2f-4e09-bfc0-3e0258676386\") " pod="openshift-ovn-kubernetes/ovnkube-node-wpvl5" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.460222 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/87c71c69-ce2f-4e09-bfc0-3e0258676386-host-kubelet\") pod \"ovnkube-node-wpvl5\" (UID: \"87c71c69-ce2f-4e09-bfc0-3e0258676386\") " pod="openshift-ovn-kubernetes/ovnkube-node-wpvl5" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.460260 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/87c71c69-ce2f-4e09-bfc0-3e0258676386-run-systemd\") pod \"ovnkube-node-wpvl5\" (UID: \"87c71c69-ce2f-4e09-bfc0-3e0258676386\") " pod="openshift-ovn-kubernetes/ovnkube-node-wpvl5" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.460324 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/87c71c69-ce2f-4e09-bfc0-3e0258676386-ovn-node-metrics-cert\") pod \"ovnkube-node-wpvl5\" (UID: \"87c71c69-ce2f-4e09-bfc0-3e0258676386\") " pod="openshift-ovn-kubernetes/ovnkube-node-wpvl5" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.460392 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/87c71c69-ce2f-4e09-bfc0-3e0258676386-env-overrides\") pod \"ovnkube-node-wpvl5\" (UID: \"87c71c69-ce2f-4e09-bfc0-3e0258676386\") " pod="openshift-ovn-kubernetes/ovnkube-node-wpvl5" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.460430 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/87c71c69-ce2f-4e09-bfc0-3e0258676386-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wpvl5\" (UID: \"87c71c69-ce2f-4e09-bfc0-3e0258676386\") " pod="openshift-ovn-kubernetes/ovnkube-node-wpvl5" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.460479 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/87c71c69-ce2f-4e09-bfc0-3e0258676386-var-lib-openvswitch\") pod \"ovnkube-node-wpvl5\" (UID: \"87c71c69-ce2f-4e09-bfc0-3e0258676386\") " pod="openshift-ovn-kubernetes/ovnkube-node-wpvl5" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.460506 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" 
(UniqueName: \"kubernetes.io/host-path/87c71c69-ce2f-4e09-bfc0-3e0258676386-run-openvswitch\") pod \"ovnkube-node-wpvl5\" (UID: \"87c71c69-ce2f-4e09-bfc0-3e0258676386\") " pod="openshift-ovn-kubernetes/ovnkube-node-wpvl5" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.460542 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/87c71c69-ce2f-4e09-bfc0-3e0258676386-run-ovn\") pod \"ovnkube-node-wpvl5\" (UID: \"87c71c69-ce2f-4e09-bfc0-3e0258676386\") " pod="openshift-ovn-kubernetes/ovnkube-node-wpvl5" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.460590 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/87c71c69-ce2f-4e09-bfc0-3e0258676386-host-cni-netd\") pod \"ovnkube-node-wpvl5\" (UID: \"87c71c69-ce2f-4e09-bfc0-3e0258676386\") " pod="openshift-ovn-kubernetes/ovnkube-node-wpvl5" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.460630 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/87c71c69-ce2f-4e09-bfc0-3e0258676386-node-log\") pod \"ovnkube-node-wpvl5\" (UID: \"87c71c69-ce2f-4e09-bfc0-3e0258676386\") " pod="openshift-ovn-kubernetes/ovnkube-node-wpvl5" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.460662 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/87c71c69-ce2f-4e09-bfc0-3e0258676386-ovnkube-config\") pod \"ovnkube-node-wpvl5\" (UID: \"87c71c69-ce2f-4e09-bfc0-3e0258676386\") " pod="openshift-ovn-kubernetes/ovnkube-node-wpvl5" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.460721 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/87c71c69-ce2f-4e09-bfc0-3e0258676386-ovnkube-script-lib\") pod \"ovnkube-node-wpvl5\" (UID: \"87c71c69-ce2f-4e09-bfc0-3e0258676386\") " pod="openshift-ovn-kubernetes/ovnkube-node-wpvl5" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.460754 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/87c71c69-ce2f-4e09-bfc0-3e0258676386-host-run-netns\") pod \"ovnkube-node-wpvl5\" (UID: \"87c71c69-ce2f-4e09-bfc0-3e0258676386\") " pod="openshift-ovn-kubernetes/ovnkube-node-wpvl5" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.460785 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/87c71c69-ce2f-4e09-bfc0-3e0258676386-host-cni-bin\") pod \"ovnkube-node-wpvl5\" (UID: \"87c71c69-ce2f-4e09-bfc0-3e0258676386\") " pod="openshift-ovn-kubernetes/ovnkube-node-wpvl5" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.460847 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/87c71c69-ce2f-4e09-bfc0-3e0258676386-systemd-units\") pod \"ovnkube-node-wpvl5\" (UID: \"87c71c69-ce2f-4e09-bfc0-3e0258676386\") " pod="openshift-ovn-kubernetes/ovnkube-node-wpvl5" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.460883 4957 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/87c71c69-ce2f-4e09-bfc0-3e0258676386-log-socket\") pod \"ovnkube-node-wpvl5\" (UID: \"87c71c69-ce2f-4e09-bfc0-3e0258676386\") " pod="openshift-ovn-kubernetes/ovnkube-node-wpvl5" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.460915 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/87c71c69-ce2f-4e09-bfc0-3e0258676386-host-slash\") pod \"ovnkube-node-wpvl5\" (UID: \"87c71c69-ce2f-4e09-bfc0-3e0258676386\") " pod="openshift-ovn-kubernetes/ovnkube-node-wpvl5" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.461003 4957 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/87adc28a-89e3-4743-a9f2-098d4a9432d8-systemd-units\") on node \"crc\" DevicePath \"\"" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.461030 4957 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/87adc28a-89e3-4743-a9f2-098d4a9432d8-node-log\") on node \"crc\" DevicePath \"\"" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.461048 4957 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/87adc28a-89e3-4743-a9f2-098d4a9432d8-env-overrides\") on node \"crc\" DevicePath \"\"" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.459237 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/87adc28a-89e3-4743-a9f2-098d4a9432d8-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "87adc28a-89e3-4743-a9f2-098d4a9432d8" (UID: "87adc28a-89e3-4743-a9f2-098d4a9432d8"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.459762 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87adc28a-89e3-4743-a9f2-098d4a9432d8-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "87adc28a-89e3-4743-a9f2-098d4a9432d8" (UID: "87adc28a-89e3-4743-a9f2-098d4a9432d8"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.459785 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87adc28a-89e3-4743-a9f2-098d4a9432d8-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "87adc28a-89e3-4743-a9f2-098d4a9432d8" (UID: "87adc28a-89e3-4743-a9f2-098d4a9432d8"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.459796 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/87adc28a-89e3-4743-a9f2-098d4a9432d8-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "87adc28a-89e3-4743-a9f2-098d4a9432d8" (UID: "87adc28a-89e3-4743-a9f2-098d4a9432d8"). InnerVolumeSpecName "host-run-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.459817 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/87adc28a-89e3-4743-a9f2-098d4a9432d8-log-socket" (OuterVolumeSpecName: "log-socket") pod "87adc28a-89e3-4743-a9f2-098d4a9432d8" (UID: "87adc28a-89e3-4743-a9f2-098d4a9432d8"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.459835 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/87adc28a-89e3-4743-a9f2-098d4a9432d8-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "87adc28a-89e3-4743-a9f2-098d4a9432d8" (UID: "87adc28a-89e3-4743-a9f2-098d4a9432d8"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.459851 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/87adc28a-89e3-4743-a9f2-098d4a9432d8-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "87adc28a-89e3-4743-a9f2-098d4a9432d8" (UID: "87adc28a-89e3-4743-a9f2-098d4a9432d8"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.459873 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/87adc28a-89e3-4743-a9f2-098d4a9432d8-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "87adc28a-89e3-4743-a9f2-098d4a9432d8" (UID: "87adc28a-89e3-4743-a9f2-098d4a9432d8"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.460186 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/87adc28a-89e3-4743-a9f2-098d4a9432d8-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "87adc28a-89e3-4743-a9f2-098d4a9432d8" (UID: "87adc28a-89e3-4743-a9f2-098d4a9432d8"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.460204 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/87adc28a-89e3-4743-a9f2-098d4a9432d8-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "87adc28a-89e3-4743-a9f2-098d4a9432d8" (UID: "87adc28a-89e3-4743-a9f2-098d4a9432d8"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.460220 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/87adc28a-89e3-4743-a9f2-098d4a9432d8-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "87adc28a-89e3-4743-a9f2-098d4a9432d8" (UID: "87adc28a-89e3-4743-a9f2-098d4a9432d8"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.460236 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/87adc28a-89e3-4743-a9f2-098d4a9432d8-host-slash" (OuterVolumeSpecName: "host-slash") pod "87adc28a-89e3-4743-a9f2-098d4a9432d8" (UID: "87adc28a-89e3-4743-a9f2-098d4a9432d8"). InnerVolumeSpecName "host-slash". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.460257 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/87adc28a-89e3-4743-a9f2-098d4a9432d8-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "87adc28a-89e3-4743-a9f2-098d4a9432d8" (UID: "87adc28a-89e3-4743-a9f2-098d4a9432d8"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.461082 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/87adc28a-89e3-4743-a9f2-098d4a9432d8-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "87adc28a-89e3-4743-a9f2-098d4a9432d8" (UID: "87adc28a-89e3-4743-a9f2-098d4a9432d8"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.466834 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87adc28a-89e3-4743-a9f2-098d4a9432d8-kube-api-access-8nhgm" (OuterVolumeSpecName: "kube-api-access-8nhgm") pod "87adc28a-89e3-4743-a9f2-098d4a9432d8" (UID: "87adc28a-89e3-4743-a9f2-098d4a9432d8"). InnerVolumeSpecName "kube-api-access-8nhgm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.469095 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87adc28a-89e3-4743-a9f2-098d4a9432d8-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "87adc28a-89e3-4743-a9f2-098d4a9432d8" (UID: "87adc28a-89e3-4743-a9f2-098d4a9432d8"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.481892 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/87adc28a-89e3-4743-a9f2-098d4a9432d8-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "87adc28a-89e3-4743-a9f2-098d4a9432d8" (UID: "87adc28a-89e3-4743-a9f2-098d4a9432d8"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.562007 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7bdp\" (UniqueName: \"kubernetes.io/projected/87c71c69-ce2f-4e09-bfc0-3e0258676386-kube-api-access-z7bdp\") pod \"ovnkube-node-wpvl5\" (UID: \"87c71c69-ce2f-4e09-bfc0-3e0258676386\") " pod="openshift-ovn-kubernetes/ovnkube-node-wpvl5" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.562071 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/87c71c69-ce2f-4e09-bfc0-3e0258676386-host-run-ovn-kubernetes\") pod \"ovnkube-node-wpvl5\" (UID: \"87c71c69-ce2f-4e09-bfc0-3e0258676386\") " pod="openshift-ovn-kubernetes/ovnkube-node-wpvl5" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.562118 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/87c71c69-ce2f-4e09-bfc0-3e0258676386-etc-openvswitch\") pod \"ovnkube-node-wpvl5\" (UID: \"87c71c69-ce2f-4e09-bfc0-3e0258676386\") " pod="openshift-ovn-kubernetes/ovnkube-node-wpvl5" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.562153 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/87c71c69-ce2f-4e09-bfc0-3e0258676386-host-kubelet\") pod \"ovnkube-node-wpvl5\" (UID: \"87c71c69-ce2f-4e09-bfc0-3e0258676386\") " pod="openshift-ovn-kubernetes/ovnkube-node-wpvl5" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.562180 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/87c71c69-ce2f-4e09-bfc0-3e0258676386-run-systemd\") pod \"ovnkube-node-wpvl5\" (UID: \"87c71c69-ce2f-4e09-bfc0-3e0258676386\") " pod="openshift-ovn-kubernetes/ovnkube-node-wpvl5" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.562221 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/87c71c69-ce2f-4e09-bfc0-3e0258676386-ovn-node-metrics-cert\") pod \"ovnkube-node-wpvl5\" (UID: \"87c71c69-ce2f-4e09-bfc0-3e0258676386\") " pod="openshift-ovn-kubernetes/ovnkube-node-wpvl5" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.562327 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/87c71c69-ce2f-4e09-bfc0-3e0258676386-env-overrides\") pod \"ovnkube-node-wpvl5\" (UID: \"87c71c69-ce2f-4e09-bfc0-3e0258676386\") " pod="openshift-ovn-kubernetes/ovnkube-node-wpvl5" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.562359 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/87c71c69-ce2f-4e09-bfc0-3e0258676386-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wpvl5\" (UID: \"87c71c69-ce2f-4e09-bfc0-3e0258676386\") " pod="openshift-ovn-kubernetes/ovnkube-node-wpvl5" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.562395 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/87c71c69-ce2f-4e09-bfc0-3e0258676386-run-openvswitch\") pod \"ovnkube-node-wpvl5\" (UID: \"87c71c69-ce2f-4e09-bfc0-3e0258676386\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-wpvl5" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.562422 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/87c71c69-ce2f-4e09-bfc0-3e0258676386-var-lib-openvswitch\") pod \"ovnkube-node-wpvl5\" (UID: \"87c71c69-ce2f-4e09-bfc0-3e0258676386\") " pod="openshift-ovn-kubernetes/ovnkube-node-wpvl5" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.562454 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/87c71c69-ce2f-4e09-bfc0-3e0258676386-run-ovn\") pod \"ovnkube-node-wpvl5\" (UID: \"87c71c69-ce2f-4e09-bfc0-3e0258676386\") " pod="openshift-ovn-kubernetes/ovnkube-node-wpvl5" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.562488 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/87c71c69-ce2f-4e09-bfc0-3e0258676386-host-cni-netd\") pod \"ovnkube-node-wpvl5\" (UID: \"87c71c69-ce2f-4e09-bfc0-3e0258676386\") " pod="openshift-ovn-kubernetes/ovnkube-node-wpvl5" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.562521 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/87c71c69-ce2f-4e09-bfc0-3e0258676386-node-log\") pod \"ovnkube-node-wpvl5\" (UID: \"87c71c69-ce2f-4e09-bfc0-3e0258676386\") " pod="openshift-ovn-kubernetes/ovnkube-node-wpvl5" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.562547 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/87c71c69-ce2f-4e09-bfc0-3e0258676386-ovnkube-config\") pod \"ovnkube-node-wpvl5\" (UID: \"87c71c69-ce2f-4e09-bfc0-3e0258676386\") " pod="openshift-ovn-kubernetes/ovnkube-node-wpvl5" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.562588 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/87c71c69-ce2f-4e09-bfc0-3e0258676386-ovnkube-script-lib\") pod \"ovnkube-node-wpvl5\" (UID: \"87c71c69-ce2f-4e09-bfc0-3e0258676386\") " pod="openshift-ovn-kubernetes/ovnkube-node-wpvl5" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.562617 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/87c71c69-ce2f-4e09-bfc0-3e0258676386-host-cni-bin\") pod \"ovnkube-node-wpvl5\" (UID: \"87c71c69-ce2f-4e09-bfc0-3e0258676386\") " pod="openshift-ovn-kubernetes/ovnkube-node-wpvl5" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.562645 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/87c71c69-ce2f-4e09-bfc0-3e0258676386-host-run-netns\") pod \"ovnkube-node-wpvl5\" (UID: \"87c71c69-ce2f-4e09-bfc0-3e0258676386\") " pod="openshift-ovn-kubernetes/ovnkube-node-wpvl5" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.562686 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/87c71c69-ce2f-4e09-bfc0-3e0258676386-systemd-units\") pod \"ovnkube-node-wpvl5\" (UID: \"87c71c69-ce2f-4e09-bfc0-3e0258676386\") " pod="openshift-ovn-kubernetes/ovnkube-node-wpvl5" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.562716 
4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/87c71c69-ce2f-4e09-bfc0-3e0258676386-host-slash\") pod \"ovnkube-node-wpvl5\" (UID: \"87c71c69-ce2f-4e09-bfc0-3e0258676386\") " pod="openshift-ovn-kubernetes/ovnkube-node-wpvl5" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.562742 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/87c71c69-ce2f-4e09-bfc0-3e0258676386-log-socket\") pod \"ovnkube-node-wpvl5\" (UID: \"87c71c69-ce2f-4e09-bfc0-3e0258676386\") " pod="openshift-ovn-kubernetes/ovnkube-node-wpvl5" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.562805 4957 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/87adc28a-89e3-4743-a9f2-098d4a9432d8-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.562829 4957 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/87adc28a-89e3-4743-a9f2-098d4a9432d8-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.562846 4957 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/87adc28a-89e3-4743-a9f2-098d4a9432d8-ovnkube-config\") on node \"crc\" DevicePath \"\"" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.562863 4957 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/87adc28a-89e3-4743-a9f2-098d4a9432d8-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.562879 4957 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/87adc28a-89e3-4743-a9f2-098d4a9432d8-run-systemd\") on node \"crc\" DevicePath \"\"" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.562894 4957 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/87adc28a-89e3-4743-a9f2-098d4a9432d8-log-socket\") on node \"crc\" DevicePath \"\"" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.562910 4957 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/87adc28a-89e3-4743-a9f2-098d4a9432d8-run-ovn\") on node \"crc\" DevicePath \"\"" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.562926 4957 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/87adc28a-89e3-4743-a9f2-098d4a9432d8-run-openvswitch\") on node \"crc\" DevicePath \"\"" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.562941 4957 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/87adc28a-89e3-4743-a9f2-098d4a9432d8-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.562959 4957 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/87adc28a-89e3-4743-a9f2-098d4a9432d8-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.562974 4957 reconciler_common.go:293] "Volume detached 
for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/87adc28a-89e3-4743-a9f2-098d4a9432d8-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.562989 4957 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/87adc28a-89e3-4743-a9f2-098d4a9432d8-host-cni-bin\") on node \"crc\" DevicePath \"\"" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.563006 4957 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/87adc28a-89e3-4743-a9f2-098d4a9432d8-host-slash\") on node \"crc\" DevicePath \"\"" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.563037 4957 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/87adc28a-89e3-4743-a9f2-098d4a9432d8-host-kubelet\") on node \"crc\" DevicePath \"\"" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.563053 4957 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8nhgm\" (UniqueName: \"kubernetes.io/projected/87adc28a-89e3-4743-a9f2-098d4a9432d8-kube-api-access-8nhgm\") on node \"crc\" DevicePath \"\"" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.563069 4957 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/87adc28a-89e3-4743-a9f2-098d4a9432d8-host-run-netns\") on node \"crc\" DevicePath \"\"" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.563086 4957 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/87adc28a-89e3-4743-a9f2-098d4a9432d8-host-cni-netd\") on node \"crc\" DevicePath \"\"" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.563145 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/87c71c69-ce2f-4e09-bfc0-3e0258676386-log-socket\") pod \"ovnkube-node-wpvl5\" (UID: \"87c71c69-ce2f-4e09-bfc0-3e0258676386\") " pod="openshift-ovn-kubernetes/ovnkube-node-wpvl5" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.563402 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/87c71c69-ce2f-4e09-bfc0-3e0258676386-var-lib-openvswitch\") pod \"ovnkube-node-wpvl5\" (UID: \"87c71c69-ce2f-4e09-bfc0-3e0258676386\") " pod="openshift-ovn-kubernetes/ovnkube-node-wpvl5" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.563548 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/87c71c69-ce2f-4e09-bfc0-3e0258676386-host-run-ovn-kubernetes\") pod \"ovnkube-node-wpvl5\" (UID: \"87c71c69-ce2f-4e09-bfc0-3e0258676386\") " pod="openshift-ovn-kubernetes/ovnkube-node-wpvl5" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.563560 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/87c71c69-ce2f-4e09-bfc0-3e0258676386-run-ovn\") pod \"ovnkube-node-wpvl5\" (UID: \"87c71c69-ce2f-4e09-bfc0-3e0258676386\") " pod="openshift-ovn-kubernetes/ovnkube-node-wpvl5" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.563599 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/87c71c69-ce2f-4e09-bfc0-3e0258676386-host-cni-netd\") pod \"ovnkube-node-wpvl5\" (UID: 
\"87c71c69-ce2f-4e09-bfc0-3e0258676386\") " pod="openshift-ovn-kubernetes/ovnkube-node-wpvl5" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.563660 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/87c71c69-ce2f-4e09-bfc0-3e0258676386-node-log\") pod \"ovnkube-node-wpvl5\" (UID: \"87c71c69-ce2f-4e09-bfc0-3e0258676386\") " pod="openshift-ovn-kubernetes/ovnkube-node-wpvl5" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.563663 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/87c71c69-ce2f-4e09-bfc0-3e0258676386-run-systemd\") pod \"ovnkube-node-wpvl5\" (UID: \"87c71c69-ce2f-4e09-bfc0-3e0258676386\") " pod="openshift-ovn-kubernetes/ovnkube-node-wpvl5" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.564009 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/87c71c69-ce2f-4e09-bfc0-3e0258676386-run-openvswitch\") pod \"ovnkube-node-wpvl5\" (UID: \"87c71c69-ce2f-4e09-bfc0-3e0258676386\") " pod="openshift-ovn-kubernetes/ovnkube-node-wpvl5" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.564032 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/87c71c69-ce2f-4e09-bfc0-3e0258676386-systemd-units\") pod \"ovnkube-node-wpvl5\" (UID: \"87c71c69-ce2f-4e09-bfc0-3e0258676386\") " pod="openshift-ovn-kubernetes/ovnkube-node-wpvl5" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.564088 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/87c71c69-ce2f-4e09-bfc0-3e0258676386-etc-openvswitch\") pod \"ovnkube-node-wpvl5\" (UID: \"87c71c69-ce2f-4e09-bfc0-3e0258676386\") " pod="openshift-ovn-kubernetes/ovnkube-node-wpvl5" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.564126 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/87c71c69-ce2f-4e09-bfc0-3e0258676386-host-run-netns\") pod \"ovnkube-node-wpvl5\" (UID: \"87c71c69-ce2f-4e09-bfc0-3e0258676386\") " pod="openshift-ovn-kubernetes/ovnkube-node-wpvl5" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.564168 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/87c71c69-ce2f-4e09-bfc0-3e0258676386-host-kubelet\") pod \"ovnkube-node-wpvl5\" (UID: \"87c71c69-ce2f-4e09-bfc0-3e0258676386\") " pod="openshift-ovn-kubernetes/ovnkube-node-wpvl5" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.564226 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/87c71c69-ce2f-4e09-bfc0-3e0258676386-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wpvl5\" (UID: \"87c71c69-ce2f-4e09-bfc0-3e0258676386\") " pod="openshift-ovn-kubernetes/ovnkube-node-wpvl5" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.564315 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/87c71c69-ce2f-4e09-bfc0-3e0258676386-host-cni-bin\") pod \"ovnkube-node-wpvl5\" (UID: \"87c71c69-ce2f-4e09-bfc0-3e0258676386\") " pod="openshift-ovn-kubernetes/ovnkube-node-wpvl5" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 
11:00:19.564754 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/87c71c69-ce2f-4e09-bfc0-3e0258676386-ovnkube-config\") pod \"ovnkube-node-wpvl5\" (UID: \"87c71c69-ce2f-4e09-bfc0-3e0258676386\") " pod="openshift-ovn-kubernetes/ovnkube-node-wpvl5" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.564890 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/87c71c69-ce2f-4e09-bfc0-3e0258676386-env-overrides\") pod \"ovnkube-node-wpvl5\" (UID: \"87c71c69-ce2f-4e09-bfc0-3e0258676386\") " pod="openshift-ovn-kubernetes/ovnkube-node-wpvl5" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.564983 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/87c71c69-ce2f-4e09-bfc0-3e0258676386-host-slash\") pod \"ovnkube-node-wpvl5\" (UID: \"87c71c69-ce2f-4e09-bfc0-3e0258676386\") " pod="openshift-ovn-kubernetes/ovnkube-node-wpvl5" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.565480 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/87c71c69-ce2f-4e09-bfc0-3e0258676386-ovnkube-script-lib\") pod \"ovnkube-node-wpvl5\" (UID: \"87c71c69-ce2f-4e09-bfc0-3e0258676386\") " pod="openshift-ovn-kubernetes/ovnkube-node-wpvl5" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.570878 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/87c71c69-ce2f-4e09-bfc0-3e0258676386-ovn-node-metrics-cert\") pod \"ovnkube-node-wpvl5\" (UID: \"87c71c69-ce2f-4e09-bfc0-3e0258676386\") " pod="openshift-ovn-kubernetes/ovnkube-node-wpvl5" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.590123 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7bdp\" (UniqueName: \"kubernetes.io/projected/87c71c69-ce2f-4e09-bfc0-3e0258676386-kube-api-access-z7bdp\") pod \"ovnkube-node-wpvl5\" (UID: \"87c71c69-ce2f-4e09-bfc0-3e0258676386\") " pod="openshift-ovn-kubernetes/ovnkube-node-wpvl5" Jan 23 11:00:19 crc kubenswrapper[4957]: I0123 11:00:19.678109 4957 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-wpvl5" Jan 23 11:00:20 crc kubenswrapper[4957]: I0123 11:00:20.142217 4957 generic.go:334] "Generic (PLEG): container finished" podID="87c71c69-ce2f-4e09-bfc0-3e0258676386" containerID="77809dbcea65a4a29e24ab144cc4d3e51304ce023f14d6aad18cb10baaf0f41b" exitCode=0 Jan 23 11:00:20 crc kubenswrapper[4957]: I0123 11:00:20.142373 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wpvl5" event={"ID":"87c71c69-ce2f-4e09-bfc0-3e0258676386","Type":"ContainerDied","Data":"77809dbcea65a4a29e24ab144cc4d3e51304ce023f14d6aad18cb10baaf0f41b"} Jan 23 11:00:20 crc kubenswrapper[4957]: I0123 11:00:20.142478 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wpvl5" event={"ID":"87c71c69-ce2f-4e09-bfc0-3e0258676386","Type":"ContainerStarted","Data":"dc922c6cdf04e301a48aefab939910535a5c8242d3ef7a8e7fd4b99fe01a2ebd"} Jan 23 11:00:20 crc kubenswrapper[4957]: I0123 11:00:20.149083 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z8hcw_87adc28a-89e3-4743-a9f2-098d4a9432d8/ovn-acl-logging/0.log" Jan 23 11:00:20 crc kubenswrapper[4957]: I0123 11:00:20.149747 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z8hcw_87adc28a-89e3-4743-a9f2-098d4a9432d8/ovn-controller/0.log" Jan 23 11:00:20 crc kubenswrapper[4957]: I0123 11:00:20.150245 4957 generic.go:334] "Generic (PLEG): container finished" podID="87adc28a-89e3-4743-a9f2-098d4a9432d8" containerID="1be5b459e3fae28da165ef0ee506ec5ccd39026d7b7e7c35a3f242c65d60d0ad" exitCode=0 Jan 23 11:00:20 crc kubenswrapper[4957]: I0123 11:00:20.150323 4957 generic.go:334] "Generic (PLEG): container finished" podID="87adc28a-89e3-4743-a9f2-098d4a9432d8" containerID="7f862c6f11fe904458a8ecde92079c0b4aa4a9cb4dfc6f2ca094a1d3142570d4" exitCode=0 Jan 23 11:00:20 crc kubenswrapper[4957]: I0123 11:00:20.150323 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" event={"ID":"87adc28a-89e3-4743-a9f2-098d4a9432d8","Type":"ContainerDied","Data":"1be5b459e3fae28da165ef0ee506ec5ccd39026d7b7e7c35a3f242c65d60d0ad"} Jan 23 11:00:20 crc kubenswrapper[4957]: I0123 11:00:20.150376 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" event={"ID":"87adc28a-89e3-4743-a9f2-098d4a9432d8","Type":"ContainerDied","Data":"7f862c6f11fe904458a8ecde92079c0b4aa4a9cb4dfc6f2ca094a1d3142570d4"} Jan 23 11:00:20 crc kubenswrapper[4957]: I0123 11:00:20.150390 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" event={"ID":"87adc28a-89e3-4743-a9f2-098d4a9432d8","Type":"ContainerDied","Data":"0efa75cf10a812bc4de5b071048558eb5f48828f6fb3049f3820fe5e0b7e2b0b"} Jan 23 11:00:20 crc kubenswrapper[4957]: I0123 11:00:20.150409 4957 scope.go:117] "RemoveContainer" containerID="084ff950e9d3aef7d2a76aab2117ab6a65b6f845927f4cbf12fe8cbd8e56c3a3" Jan 23 11:00:20 crc kubenswrapper[4957]: I0123 11:00:20.150412 4957 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" Jan 23 11:00:20 crc kubenswrapper[4957]: I0123 11:00:20.150342 4957 generic.go:334] "Generic (PLEG): container finished" podID="87adc28a-89e3-4743-a9f2-098d4a9432d8" containerID="0efa75cf10a812bc4de5b071048558eb5f48828f6fb3049f3820fe5e0b7e2b0b" exitCode=0 Jan 23 11:00:20 crc kubenswrapper[4957]: I0123 11:00:20.150619 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8hcw" event={"ID":"87adc28a-89e3-4743-a9f2-098d4a9432d8","Type":"ContainerDied","Data":"7e83039e1bb245b00fe052661f4e75b110fe0bb66a8a851421d6385c4316b9ba"} Jan 23 11:00:20 crc kubenswrapper[4957]: I0123 11:00:20.153555 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-tlz2g_233fdd78-4010-4fe8-9068-ee47d8ff25d1/kube-multus/2.log" Jan 23 11:00:20 crc kubenswrapper[4957]: I0123 11:00:20.171765 4957 scope.go:117] "RemoveContainer" containerID="1be5b459e3fae28da165ef0ee506ec5ccd39026d7b7e7c35a3f242c65d60d0ad" Jan 23 11:00:20 crc kubenswrapper[4957]: I0123 11:00:20.205317 4957 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-z8hcw"] Jan 23 11:00:20 crc kubenswrapper[4957]: I0123 11:00:20.210703 4957 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-z8hcw"] Jan 23 11:00:20 crc kubenswrapper[4957]: I0123 11:00:20.215544 4957 scope.go:117] "RemoveContainer" containerID="7f862c6f11fe904458a8ecde92079c0b4aa4a9cb4dfc6f2ca094a1d3142570d4" Jan 23 11:00:20 crc kubenswrapper[4957]: I0123 11:00:20.232580 4957 scope.go:117] "RemoveContainer" containerID="0efa75cf10a812bc4de5b071048558eb5f48828f6fb3049f3820fe5e0b7e2b0b" Jan 23 11:00:20 crc kubenswrapper[4957]: I0123 11:00:20.262996 4957 scope.go:117] "RemoveContainer" containerID="0ca18567eec1b0cc34d911b28d9f3d670a061722086817f58236f6a0da557262" Jan 23 11:00:20 crc kubenswrapper[4957]: I0123 11:00:20.278122 4957 scope.go:117] "RemoveContainer" containerID="1a14cf2687aa7c7a4c43dffbc2ad99a41aef0e46719171f63c7f769ee2d54e4f" Jan 23 11:00:20 crc kubenswrapper[4957]: I0123 11:00:20.294340 4957 scope.go:117] "RemoveContainer" containerID="d1a90ac89ce8ac710e5f8cff26e69aff44a735ef8155a7e93324809904a33e01" Jan 23 11:00:20 crc kubenswrapper[4957]: I0123 11:00:20.317537 4957 scope.go:117] "RemoveContainer" containerID="26b4bdc4f2514902dc2c95df59af4c954a5c1905821f5981e9437ff54d6d544d" Jan 23 11:00:20 crc kubenswrapper[4957]: I0123 11:00:20.331561 4957 scope.go:117] "RemoveContainer" containerID="3de3a4f166173fb3f0ff7428b1d1ddf43a9988749fc445ecff4a7527bd1f3a4b" Jan 23 11:00:20 crc kubenswrapper[4957]: I0123 11:00:20.345669 4957 scope.go:117] "RemoveContainer" containerID="084ff950e9d3aef7d2a76aab2117ab6a65b6f845927f4cbf12fe8cbd8e56c3a3" Jan 23 11:00:20 crc kubenswrapper[4957]: E0123 11:00:20.345993 4957 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"084ff950e9d3aef7d2a76aab2117ab6a65b6f845927f4cbf12fe8cbd8e56c3a3\": container with ID starting with 084ff950e9d3aef7d2a76aab2117ab6a65b6f845927f4cbf12fe8cbd8e56c3a3 not found: ID does not exist" containerID="084ff950e9d3aef7d2a76aab2117ab6a65b6f845927f4cbf12fe8cbd8e56c3a3" Jan 23 11:00:20 crc kubenswrapper[4957]: I0123 11:00:20.346079 4957 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"084ff950e9d3aef7d2a76aab2117ab6a65b6f845927f4cbf12fe8cbd8e56c3a3"} err="failed to get container status 
\"084ff950e9d3aef7d2a76aab2117ab6a65b6f845927f4cbf12fe8cbd8e56c3a3\": rpc error: code = NotFound desc = could not find container \"084ff950e9d3aef7d2a76aab2117ab6a65b6f845927f4cbf12fe8cbd8e56c3a3\": container with ID starting with 084ff950e9d3aef7d2a76aab2117ab6a65b6f845927f4cbf12fe8cbd8e56c3a3 not found: ID does not exist" Jan 23 11:00:20 crc kubenswrapper[4957]: I0123 11:00:20.346161 4957 scope.go:117] "RemoveContainer" containerID="1be5b459e3fae28da165ef0ee506ec5ccd39026d7b7e7c35a3f242c65d60d0ad" Jan 23 11:00:20 crc kubenswrapper[4957]: E0123 11:00:20.346444 4957 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1be5b459e3fae28da165ef0ee506ec5ccd39026d7b7e7c35a3f242c65d60d0ad\": container with ID starting with 1be5b459e3fae28da165ef0ee506ec5ccd39026d7b7e7c35a3f242c65d60d0ad not found: ID does not exist" containerID="1be5b459e3fae28da165ef0ee506ec5ccd39026d7b7e7c35a3f242c65d60d0ad" Jan 23 11:00:20 crc kubenswrapper[4957]: I0123 11:00:20.346599 4957 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1be5b459e3fae28da165ef0ee506ec5ccd39026d7b7e7c35a3f242c65d60d0ad"} err="failed to get container status \"1be5b459e3fae28da165ef0ee506ec5ccd39026d7b7e7c35a3f242c65d60d0ad\": rpc error: code = NotFound desc = could not find container \"1be5b459e3fae28da165ef0ee506ec5ccd39026d7b7e7c35a3f242c65d60d0ad\": container with ID starting with 1be5b459e3fae28da165ef0ee506ec5ccd39026d7b7e7c35a3f242c65d60d0ad not found: ID does not exist" Jan 23 11:00:20 crc kubenswrapper[4957]: I0123 11:00:20.346747 4957 scope.go:117] "RemoveContainer" containerID="7f862c6f11fe904458a8ecde92079c0b4aa4a9cb4dfc6f2ca094a1d3142570d4" Jan 23 11:00:20 crc kubenswrapper[4957]: E0123 11:00:20.347173 4957 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f862c6f11fe904458a8ecde92079c0b4aa4a9cb4dfc6f2ca094a1d3142570d4\": container with ID starting with 7f862c6f11fe904458a8ecde92079c0b4aa4a9cb4dfc6f2ca094a1d3142570d4 not found: ID does not exist" containerID="7f862c6f11fe904458a8ecde92079c0b4aa4a9cb4dfc6f2ca094a1d3142570d4" Jan 23 11:00:20 crc kubenswrapper[4957]: I0123 11:00:20.347218 4957 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f862c6f11fe904458a8ecde92079c0b4aa4a9cb4dfc6f2ca094a1d3142570d4"} err="failed to get container status \"7f862c6f11fe904458a8ecde92079c0b4aa4a9cb4dfc6f2ca094a1d3142570d4\": rpc error: code = NotFound desc = could not find container \"7f862c6f11fe904458a8ecde92079c0b4aa4a9cb4dfc6f2ca094a1d3142570d4\": container with ID starting with 7f862c6f11fe904458a8ecde92079c0b4aa4a9cb4dfc6f2ca094a1d3142570d4 not found: ID does not exist" Jan 23 11:00:20 crc kubenswrapper[4957]: I0123 11:00:20.347232 4957 scope.go:117] "RemoveContainer" containerID="0efa75cf10a812bc4de5b071048558eb5f48828f6fb3049f3820fe5e0b7e2b0b" Jan 23 11:00:20 crc kubenswrapper[4957]: E0123 11:00:20.348435 4957 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0efa75cf10a812bc4de5b071048558eb5f48828f6fb3049f3820fe5e0b7e2b0b\": container with ID starting with 0efa75cf10a812bc4de5b071048558eb5f48828f6fb3049f3820fe5e0b7e2b0b not found: ID does not exist" containerID="0efa75cf10a812bc4de5b071048558eb5f48828f6fb3049f3820fe5e0b7e2b0b" Jan 23 11:00:20 crc kubenswrapper[4957]: I0123 11:00:20.349351 4957 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0efa75cf10a812bc4de5b071048558eb5f48828f6fb3049f3820fe5e0b7e2b0b"} err="failed to get container status \"0efa75cf10a812bc4de5b071048558eb5f48828f6fb3049f3820fe5e0b7e2b0b\": rpc error: code = NotFound desc = could not find container \"0efa75cf10a812bc4de5b071048558eb5f48828f6fb3049f3820fe5e0b7e2b0b\": container with ID starting with 0efa75cf10a812bc4de5b071048558eb5f48828f6fb3049f3820fe5e0b7e2b0b not found: ID does not exist" Jan 23 11:00:20 crc kubenswrapper[4957]: I0123 11:00:20.349439 4957 scope.go:117] "RemoveContainer" containerID="0ca18567eec1b0cc34d911b28d9f3d670a061722086817f58236f6a0da557262" Jan 23 11:00:20 crc kubenswrapper[4957]: E0123 11:00:20.349792 4957 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ca18567eec1b0cc34d911b28d9f3d670a061722086817f58236f6a0da557262\": container with ID starting with 0ca18567eec1b0cc34d911b28d9f3d670a061722086817f58236f6a0da557262 not found: ID does not exist" containerID="0ca18567eec1b0cc34d911b28d9f3d670a061722086817f58236f6a0da557262" Jan 23 11:00:20 crc kubenswrapper[4957]: I0123 11:00:20.349839 4957 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ca18567eec1b0cc34d911b28d9f3d670a061722086817f58236f6a0da557262"} err="failed to get container status \"0ca18567eec1b0cc34d911b28d9f3d670a061722086817f58236f6a0da557262\": rpc error: code = NotFound desc = could not find container \"0ca18567eec1b0cc34d911b28d9f3d670a061722086817f58236f6a0da557262\": container with ID starting with 0ca18567eec1b0cc34d911b28d9f3d670a061722086817f58236f6a0da557262 not found: ID does not exist" Jan 23 11:00:20 crc kubenswrapper[4957]: I0123 11:00:20.349867 4957 scope.go:117] "RemoveContainer" containerID="1a14cf2687aa7c7a4c43dffbc2ad99a41aef0e46719171f63c7f769ee2d54e4f" Jan 23 11:00:20 crc kubenswrapper[4957]: E0123 11:00:20.350121 4957 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a14cf2687aa7c7a4c43dffbc2ad99a41aef0e46719171f63c7f769ee2d54e4f\": container with ID starting with 1a14cf2687aa7c7a4c43dffbc2ad99a41aef0e46719171f63c7f769ee2d54e4f not found: ID does not exist" containerID="1a14cf2687aa7c7a4c43dffbc2ad99a41aef0e46719171f63c7f769ee2d54e4f" Jan 23 11:00:20 crc kubenswrapper[4957]: I0123 11:00:20.350188 4957 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a14cf2687aa7c7a4c43dffbc2ad99a41aef0e46719171f63c7f769ee2d54e4f"} err="failed to get container status \"1a14cf2687aa7c7a4c43dffbc2ad99a41aef0e46719171f63c7f769ee2d54e4f\": rpc error: code = NotFound desc = could not find container \"1a14cf2687aa7c7a4c43dffbc2ad99a41aef0e46719171f63c7f769ee2d54e4f\": container with ID starting with 1a14cf2687aa7c7a4c43dffbc2ad99a41aef0e46719171f63c7f769ee2d54e4f not found: ID does not exist" Jan 23 11:00:20 crc kubenswrapper[4957]: I0123 11:00:20.350251 4957 scope.go:117] "RemoveContainer" containerID="d1a90ac89ce8ac710e5f8cff26e69aff44a735ef8155a7e93324809904a33e01" Jan 23 11:00:20 crc kubenswrapper[4957]: E0123 11:00:20.350516 4957 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1a90ac89ce8ac710e5f8cff26e69aff44a735ef8155a7e93324809904a33e01\": container with ID starting with d1a90ac89ce8ac710e5f8cff26e69aff44a735ef8155a7e93324809904a33e01 not found: ID does not exist" 
containerID="d1a90ac89ce8ac710e5f8cff26e69aff44a735ef8155a7e93324809904a33e01" Jan 23 11:00:20 crc kubenswrapper[4957]: I0123 11:00:20.350589 4957 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1a90ac89ce8ac710e5f8cff26e69aff44a735ef8155a7e93324809904a33e01"} err="failed to get container status \"d1a90ac89ce8ac710e5f8cff26e69aff44a735ef8155a7e93324809904a33e01\": rpc error: code = NotFound desc = could not find container \"d1a90ac89ce8ac710e5f8cff26e69aff44a735ef8155a7e93324809904a33e01\": container with ID starting with d1a90ac89ce8ac710e5f8cff26e69aff44a735ef8155a7e93324809904a33e01 not found: ID does not exist" Jan 23 11:00:20 crc kubenswrapper[4957]: I0123 11:00:20.350692 4957 scope.go:117] "RemoveContainer" containerID="26b4bdc4f2514902dc2c95df59af4c954a5c1905821f5981e9437ff54d6d544d" Jan 23 11:00:20 crc kubenswrapper[4957]: E0123 11:00:20.351028 4957 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26b4bdc4f2514902dc2c95df59af4c954a5c1905821f5981e9437ff54d6d544d\": container with ID starting with 26b4bdc4f2514902dc2c95df59af4c954a5c1905821f5981e9437ff54d6d544d not found: ID does not exist" containerID="26b4bdc4f2514902dc2c95df59af4c954a5c1905821f5981e9437ff54d6d544d" Jan 23 11:00:20 crc kubenswrapper[4957]: I0123 11:00:20.351103 4957 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26b4bdc4f2514902dc2c95df59af4c954a5c1905821f5981e9437ff54d6d544d"} err="failed to get container status \"26b4bdc4f2514902dc2c95df59af4c954a5c1905821f5981e9437ff54d6d544d\": rpc error: code = NotFound desc = could not find container \"26b4bdc4f2514902dc2c95df59af4c954a5c1905821f5981e9437ff54d6d544d\": container with ID starting with 26b4bdc4f2514902dc2c95df59af4c954a5c1905821f5981e9437ff54d6d544d not found: ID does not exist" Jan 23 11:00:20 crc kubenswrapper[4957]: I0123 11:00:20.351163 4957 scope.go:117] "RemoveContainer" containerID="3de3a4f166173fb3f0ff7428b1d1ddf43a9988749fc445ecff4a7527bd1f3a4b" Jan 23 11:00:20 crc kubenswrapper[4957]: E0123 11:00:20.351441 4957 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3de3a4f166173fb3f0ff7428b1d1ddf43a9988749fc445ecff4a7527bd1f3a4b\": container with ID starting with 3de3a4f166173fb3f0ff7428b1d1ddf43a9988749fc445ecff4a7527bd1f3a4b not found: ID does not exist" containerID="3de3a4f166173fb3f0ff7428b1d1ddf43a9988749fc445ecff4a7527bd1f3a4b" Jan 23 11:00:20 crc kubenswrapper[4957]: I0123 11:00:20.351527 4957 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3de3a4f166173fb3f0ff7428b1d1ddf43a9988749fc445ecff4a7527bd1f3a4b"} err="failed to get container status \"3de3a4f166173fb3f0ff7428b1d1ddf43a9988749fc445ecff4a7527bd1f3a4b\": rpc error: code = NotFound desc = could not find container \"3de3a4f166173fb3f0ff7428b1d1ddf43a9988749fc445ecff4a7527bd1f3a4b\": container with ID starting with 3de3a4f166173fb3f0ff7428b1d1ddf43a9988749fc445ecff4a7527bd1f3a4b not found: ID does not exist" Jan 23 11:00:20 crc kubenswrapper[4957]: I0123 11:00:20.351597 4957 scope.go:117] "RemoveContainer" containerID="084ff950e9d3aef7d2a76aab2117ab6a65b6f845927f4cbf12fe8cbd8e56c3a3" Jan 23 11:00:20 crc kubenswrapper[4957]: I0123 11:00:20.351828 4957 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"084ff950e9d3aef7d2a76aab2117ab6a65b6f845927f4cbf12fe8cbd8e56c3a3"} err="failed to get container status \"084ff950e9d3aef7d2a76aab2117ab6a65b6f845927f4cbf12fe8cbd8e56c3a3\": rpc error: code = NotFound desc = could not find container \"084ff950e9d3aef7d2a76aab2117ab6a65b6f845927f4cbf12fe8cbd8e56c3a3\": container with ID starting with 084ff950e9d3aef7d2a76aab2117ab6a65b6f845927f4cbf12fe8cbd8e56c3a3 not found: ID does not exist" Jan 23 11:00:20 crc kubenswrapper[4957]: I0123 11:00:20.351897 4957 scope.go:117] "RemoveContainer" containerID="1be5b459e3fae28da165ef0ee506ec5ccd39026d7b7e7c35a3f242c65d60d0ad" Jan 23 11:00:20 crc kubenswrapper[4957]: I0123 11:00:20.352155 4957 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1be5b459e3fae28da165ef0ee506ec5ccd39026d7b7e7c35a3f242c65d60d0ad"} err="failed to get container status \"1be5b459e3fae28da165ef0ee506ec5ccd39026d7b7e7c35a3f242c65d60d0ad\": rpc error: code = NotFound desc = could not find container \"1be5b459e3fae28da165ef0ee506ec5ccd39026d7b7e7c35a3f242c65d60d0ad\": container with ID starting with 1be5b459e3fae28da165ef0ee506ec5ccd39026d7b7e7c35a3f242c65d60d0ad not found: ID does not exist" Jan 23 11:00:20 crc kubenswrapper[4957]: I0123 11:00:20.352228 4957 scope.go:117] "RemoveContainer" containerID="7f862c6f11fe904458a8ecde92079c0b4aa4a9cb4dfc6f2ca094a1d3142570d4" Jan 23 11:00:20 crc kubenswrapper[4957]: I0123 11:00:20.352508 4957 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f862c6f11fe904458a8ecde92079c0b4aa4a9cb4dfc6f2ca094a1d3142570d4"} err="failed to get container status \"7f862c6f11fe904458a8ecde92079c0b4aa4a9cb4dfc6f2ca094a1d3142570d4\": rpc error: code = NotFound desc = could not find container \"7f862c6f11fe904458a8ecde92079c0b4aa4a9cb4dfc6f2ca094a1d3142570d4\": container with ID starting with 7f862c6f11fe904458a8ecde92079c0b4aa4a9cb4dfc6f2ca094a1d3142570d4 not found: ID does not exist" Jan 23 11:00:20 crc kubenswrapper[4957]: I0123 11:00:20.352585 4957 scope.go:117] "RemoveContainer" containerID="0efa75cf10a812bc4de5b071048558eb5f48828f6fb3049f3820fe5e0b7e2b0b" Jan 23 11:00:20 crc kubenswrapper[4957]: I0123 11:00:20.352817 4957 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0efa75cf10a812bc4de5b071048558eb5f48828f6fb3049f3820fe5e0b7e2b0b"} err="failed to get container status \"0efa75cf10a812bc4de5b071048558eb5f48828f6fb3049f3820fe5e0b7e2b0b\": rpc error: code = NotFound desc = could not find container \"0efa75cf10a812bc4de5b071048558eb5f48828f6fb3049f3820fe5e0b7e2b0b\": container with ID starting with 0efa75cf10a812bc4de5b071048558eb5f48828f6fb3049f3820fe5e0b7e2b0b not found: ID does not exist" Jan 23 11:00:20 crc kubenswrapper[4957]: I0123 11:00:20.352892 4957 scope.go:117] "RemoveContainer" containerID="0ca18567eec1b0cc34d911b28d9f3d670a061722086817f58236f6a0da557262" Jan 23 11:00:20 crc kubenswrapper[4957]: I0123 11:00:20.353130 4957 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ca18567eec1b0cc34d911b28d9f3d670a061722086817f58236f6a0da557262"} err="failed to get container status \"0ca18567eec1b0cc34d911b28d9f3d670a061722086817f58236f6a0da557262\": rpc error: code = NotFound desc = could not find container \"0ca18567eec1b0cc34d911b28d9f3d670a061722086817f58236f6a0da557262\": container with ID starting with 0ca18567eec1b0cc34d911b28d9f3d670a061722086817f58236f6a0da557262 not found: ID does not exist" Jan 
23 11:00:20 crc kubenswrapper[4957]: I0123 11:00:20.353216 4957 scope.go:117] "RemoveContainer" containerID="1a14cf2687aa7c7a4c43dffbc2ad99a41aef0e46719171f63c7f769ee2d54e4f" Jan 23 11:00:20 crc kubenswrapper[4957]: I0123 11:00:20.353458 4957 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a14cf2687aa7c7a4c43dffbc2ad99a41aef0e46719171f63c7f769ee2d54e4f"} err="failed to get container status \"1a14cf2687aa7c7a4c43dffbc2ad99a41aef0e46719171f63c7f769ee2d54e4f\": rpc error: code = NotFound desc = could not find container \"1a14cf2687aa7c7a4c43dffbc2ad99a41aef0e46719171f63c7f769ee2d54e4f\": container with ID starting with 1a14cf2687aa7c7a4c43dffbc2ad99a41aef0e46719171f63c7f769ee2d54e4f not found: ID does not exist" Jan 23 11:00:20 crc kubenswrapper[4957]: I0123 11:00:20.353526 4957 scope.go:117] "RemoveContainer" containerID="d1a90ac89ce8ac710e5f8cff26e69aff44a735ef8155a7e93324809904a33e01" Jan 23 11:00:20 crc kubenswrapper[4957]: I0123 11:00:20.353883 4957 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1a90ac89ce8ac710e5f8cff26e69aff44a735ef8155a7e93324809904a33e01"} err="failed to get container status \"d1a90ac89ce8ac710e5f8cff26e69aff44a735ef8155a7e93324809904a33e01\": rpc error: code = NotFound desc = could not find container \"d1a90ac89ce8ac710e5f8cff26e69aff44a735ef8155a7e93324809904a33e01\": container with ID starting with d1a90ac89ce8ac710e5f8cff26e69aff44a735ef8155a7e93324809904a33e01 not found: ID does not exist" Jan 23 11:00:20 crc kubenswrapper[4957]: I0123 11:00:20.353980 4957 scope.go:117] "RemoveContainer" containerID="26b4bdc4f2514902dc2c95df59af4c954a5c1905821f5981e9437ff54d6d544d" Jan 23 11:00:20 crc kubenswrapper[4957]: I0123 11:00:20.354196 4957 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26b4bdc4f2514902dc2c95df59af4c954a5c1905821f5981e9437ff54d6d544d"} err="failed to get container status \"26b4bdc4f2514902dc2c95df59af4c954a5c1905821f5981e9437ff54d6d544d\": rpc error: code = NotFound desc = could not find container \"26b4bdc4f2514902dc2c95df59af4c954a5c1905821f5981e9437ff54d6d544d\": container with ID starting with 26b4bdc4f2514902dc2c95df59af4c954a5c1905821f5981e9437ff54d6d544d not found: ID does not exist" Jan 23 11:00:20 crc kubenswrapper[4957]: I0123 11:00:20.354274 4957 scope.go:117] "RemoveContainer" containerID="3de3a4f166173fb3f0ff7428b1d1ddf43a9988749fc445ecff4a7527bd1f3a4b" Jan 23 11:00:20 crc kubenswrapper[4957]: I0123 11:00:20.354707 4957 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3de3a4f166173fb3f0ff7428b1d1ddf43a9988749fc445ecff4a7527bd1f3a4b"} err="failed to get container status \"3de3a4f166173fb3f0ff7428b1d1ddf43a9988749fc445ecff4a7527bd1f3a4b\": rpc error: code = NotFound desc = could not find container \"3de3a4f166173fb3f0ff7428b1d1ddf43a9988749fc445ecff4a7527bd1f3a4b\": container with ID starting with 3de3a4f166173fb3f0ff7428b1d1ddf43a9988749fc445ecff4a7527bd1f3a4b not found: ID does not exist" Jan 23 11:00:20 crc kubenswrapper[4957]: I0123 11:00:20.354799 4957 scope.go:117] "RemoveContainer" containerID="084ff950e9d3aef7d2a76aab2117ab6a65b6f845927f4cbf12fe8cbd8e56c3a3" Jan 23 11:00:20 crc kubenswrapper[4957]: I0123 11:00:20.355126 4957 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"084ff950e9d3aef7d2a76aab2117ab6a65b6f845927f4cbf12fe8cbd8e56c3a3"} err="failed to get container status 
\"084ff950e9d3aef7d2a76aab2117ab6a65b6f845927f4cbf12fe8cbd8e56c3a3\": rpc error: code = NotFound desc = could not find container \"084ff950e9d3aef7d2a76aab2117ab6a65b6f845927f4cbf12fe8cbd8e56c3a3\": container with ID starting with 084ff950e9d3aef7d2a76aab2117ab6a65b6f845927f4cbf12fe8cbd8e56c3a3 not found: ID does not exist" Jan 23 11:00:20 crc kubenswrapper[4957]: I0123 11:00:20.355217 4957 scope.go:117] "RemoveContainer" containerID="1be5b459e3fae28da165ef0ee506ec5ccd39026d7b7e7c35a3f242c65d60d0ad" Jan 23 11:00:20 crc kubenswrapper[4957]: I0123 11:00:20.355553 4957 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1be5b459e3fae28da165ef0ee506ec5ccd39026d7b7e7c35a3f242c65d60d0ad"} err="failed to get container status \"1be5b459e3fae28da165ef0ee506ec5ccd39026d7b7e7c35a3f242c65d60d0ad\": rpc error: code = NotFound desc = could not find container \"1be5b459e3fae28da165ef0ee506ec5ccd39026d7b7e7c35a3f242c65d60d0ad\": container with ID starting with 1be5b459e3fae28da165ef0ee506ec5ccd39026d7b7e7c35a3f242c65d60d0ad not found: ID does not exist" Jan 23 11:00:20 crc kubenswrapper[4957]: I0123 11:00:20.355626 4957 scope.go:117] "RemoveContainer" containerID="7f862c6f11fe904458a8ecde92079c0b4aa4a9cb4dfc6f2ca094a1d3142570d4" Jan 23 11:00:20 crc kubenswrapper[4957]: I0123 11:00:20.355891 4957 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f862c6f11fe904458a8ecde92079c0b4aa4a9cb4dfc6f2ca094a1d3142570d4"} err="failed to get container status \"7f862c6f11fe904458a8ecde92079c0b4aa4a9cb4dfc6f2ca094a1d3142570d4\": rpc error: code = NotFound desc = could not find container \"7f862c6f11fe904458a8ecde92079c0b4aa4a9cb4dfc6f2ca094a1d3142570d4\": container with ID starting with 7f862c6f11fe904458a8ecde92079c0b4aa4a9cb4dfc6f2ca094a1d3142570d4 not found: ID does not exist" Jan 23 11:00:20 crc kubenswrapper[4957]: I0123 11:00:20.355966 4957 scope.go:117] "RemoveContainer" containerID="0efa75cf10a812bc4de5b071048558eb5f48828f6fb3049f3820fe5e0b7e2b0b" Jan 23 11:00:20 crc kubenswrapper[4957]: I0123 11:00:20.356310 4957 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0efa75cf10a812bc4de5b071048558eb5f48828f6fb3049f3820fe5e0b7e2b0b"} err="failed to get container status \"0efa75cf10a812bc4de5b071048558eb5f48828f6fb3049f3820fe5e0b7e2b0b\": rpc error: code = NotFound desc = could not find container \"0efa75cf10a812bc4de5b071048558eb5f48828f6fb3049f3820fe5e0b7e2b0b\": container with ID starting with 0efa75cf10a812bc4de5b071048558eb5f48828f6fb3049f3820fe5e0b7e2b0b not found: ID does not exist" Jan 23 11:00:20 crc kubenswrapper[4957]: I0123 11:00:20.356415 4957 scope.go:117] "RemoveContainer" containerID="0ca18567eec1b0cc34d911b28d9f3d670a061722086817f58236f6a0da557262" Jan 23 11:00:20 crc kubenswrapper[4957]: I0123 11:00:20.356891 4957 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ca18567eec1b0cc34d911b28d9f3d670a061722086817f58236f6a0da557262"} err="failed to get container status \"0ca18567eec1b0cc34d911b28d9f3d670a061722086817f58236f6a0da557262\": rpc error: code = NotFound desc = could not find container \"0ca18567eec1b0cc34d911b28d9f3d670a061722086817f58236f6a0da557262\": container with ID starting with 0ca18567eec1b0cc34d911b28d9f3d670a061722086817f58236f6a0da557262 not found: ID does not exist" Jan 23 11:00:20 crc kubenswrapper[4957]: I0123 11:00:20.356960 4957 scope.go:117] "RemoveContainer" 
containerID="1a14cf2687aa7c7a4c43dffbc2ad99a41aef0e46719171f63c7f769ee2d54e4f" Jan 23 11:00:20 crc kubenswrapper[4957]: I0123 11:00:20.357242 4957 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a14cf2687aa7c7a4c43dffbc2ad99a41aef0e46719171f63c7f769ee2d54e4f"} err="failed to get container status \"1a14cf2687aa7c7a4c43dffbc2ad99a41aef0e46719171f63c7f769ee2d54e4f\": rpc error: code = NotFound desc = could not find container \"1a14cf2687aa7c7a4c43dffbc2ad99a41aef0e46719171f63c7f769ee2d54e4f\": container with ID starting with 1a14cf2687aa7c7a4c43dffbc2ad99a41aef0e46719171f63c7f769ee2d54e4f not found: ID does not exist" Jan 23 11:00:20 crc kubenswrapper[4957]: I0123 11:00:20.357332 4957 scope.go:117] "RemoveContainer" containerID="d1a90ac89ce8ac710e5f8cff26e69aff44a735ef8155a7e93324809904a33e01" Jan 23 11:00:20 crc kubenswrapper[4957]: I0123 11:00:20.357607 4957 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1a90ac89ce8ac710e5f8cff26e69aff44a735ef8155a7e93324809904a33e01"} err="failed to get container status \"d1a90ac89ce8ac710e5f8cff26e69aff44a735ef8155a7e93324809904a33e01\": rpc error: code = NotFound desc = could not find container \"d1a90ac89ce8ac710e5f8cff26e69aff44a735ef8155a7e93324809904a33e01\": container with ID starting with d1a90ac89ce8ac710e5f8cff26e69aff44a735ef8155a7e93324809904a33e01 not found: ID does not exist" Jan 23 11:00:20 crc kubenswrapper[4957]: I0123 11:00:20.357684 4957 scope.go:117] "RemoveContainer" containerID="26b4bdc4f2514902dc2c95df59af4c954a5c1905821f5981e9437ff54d6d544d" Jan 23 11:00:20 crc kubenswrapper[4957]: I0123 11:00:20.357989 4957 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26b4bdc4f2514902dc2c95df59af4c954a5c1905821f5981e9437ff54d6d544d"} err="failed to get container status \"26b4bdc4f2514902dc2c95df59af4c954a5c1905821f5981e9437ff54d6d544d\": rpc error: code = NotFound desc = could not find container \"26b4bdc4f2514902dc2c95df59af4c954a5c1905821f5981e9437ff54d6d544d\": container with ID starting with 26b4bdc4f2514902dc2c95df59af4c954a5c1905821f5981e9437ff54d6d544d not found: ID does not exist" Jan 23 11:00:20 crc kubenswrapper[4957]: I0123 11:00:20.358064 4957 scope.go:117] "RemoveContainer" containerID="3de3a4f166173fb3f0ff7428b1d1ddf43a9988749fc445ecff4a7527bd1f3a4b" Jan 23 11:00:20 crc kubenswrapper[4957]: I0123 11:00:20.358378 4957 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3de3a4f166173fb3f0ff7428b1d1ddf43a9988749fc445ecff4a7527bd1f3a4b"} err="failed to get container status \"3de3a4f166173fb3f0ff7428b1d1ddf43a9988749fc445ecff4a7527bd1f3a4b\": rpc error: code = NotFound desc = could not find container \"3de3a4f166173fb3f0ff7428b1d1ddf43a9988749fc445ecff4a7527bd1f3a4b\": container with ID starting with 3de3a4f166173fb3f0ff7428b1d1ddf43a9988749fc445ecff4a7527bd1f3a4b not found: ID does not exist" Jan 23 11:00:20 crc kubenswrapper[4957]: I0123 11:00:20.782466 4957 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87adc28a-89e3-4743-a9f2-098d4a9432d8" path="/var/lib/kubelet/pods/87adc28a-89e3-4743-a9f2-098d4a9432d8/volumes" Jan 23 11:00:21 crc kubenswrapper[4957]: I0123 11:00:21.163252 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wpvl5" 
event={"ID":"87c71c69-ce2f-4e09-bfc0-3e0258676386","Type":"ContainerStarted","Data":"9ecc8c70cffa8fe14e10cb20ed3939ef6b8713653072345d01e58a88eb1f1555"} Jan 23 11:00:21 crc kubenswrapper[4957]: I0123 11:00:21.163312 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wpvl5" event={"ID":"87c71c69-ce2f-4e09-bfc0-3e0258676386","Type":"ContainerStarted","Data":"de8db5514c14e94acd83b5b534ee4e1366a02617084b48b6135cf65bab9b9dda"} Jan 23 11:00:21 crc kubenswrapper[4957]: I0123 11:00:21.163325 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wpvl5" event={"ID":"87c71c69-ce2f-4e09-bfc0-3e0258676386","Type":"ContainerStarted","Data":"b69cc577920cda64e8fc33da03a78644ac0e8ba43b1a2d56c47b13fede3ed5fb"} Jan 23 11:00:21 crc kubenswrapper[4957]: I0123 11:00:21.163336 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wpvl5" event={"ID":"87c71c69-ce2f-4e09-bfc0-3e0258676386","Type":"ContainerStarted","Data":"96e48046329ab4e794071dec78156ba1abdf2a2e8b1751de1893282ecdbfd3cb"} Jan 23 11:00:21 crc kubenswrapper[4957]: I0123 11:00:21.163345 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wpvl5" event={"ID":"87c71c69-ce2f-4e09-bfc0-3e0258676386","Type":"ContainerStarted","Data":"869b1c75ba4dc3d414fc6507a2960e12d0e47abf96b6848e1f5e3e5a59e5fcba"} Jan 23 11:00:21 crc kubenswrapper[4957]: I0123 11:00:21.163353 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wpvl5" event={"ID":"87c71c69-ce2f-4e09-bfc0-3e0258676386","Type":"ContainerStarted","Data":"5367919c150b7ebcef26fecf3d1420d563148f734a3d1b669a88649fa6a4a88b"} Jan 23 11:00:23 crc kubenswrapper[4957]: I0123 11:00:23.182747 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wpvl5" event={"ID":"87c71c69-ce2f-4e09-bfc0-3e0258676386","Type":"ContainerStarted","Data":"f33ab2482ce663db123321396c97b4f20690f4fa652b9e6542f2af9ce53b326c"} Jan 23 11:00:26 crc kubenswrapper[4957]: I0123 11:00:26.203581 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wpvl5" event={"ID":"87c71c69-ce2f-4e09-bfc0-3e0258676386","Type":"ContainerStarted","Data":"e31421c62d001e3ff634c213390756bfb60e5adf9dff6f9eb01f2464ff9637bf"} Jan 23 11:00:26 crc kubenswrapper[4957]: I0123 11:00:26.204151 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-wpvl5" Jan 23 11:00:26 crc kubenswrapper[4957]: I0123 11:00:26.204168 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-wpvl5" Jan 23 11:00:26 crc kubenswrapper[4957]: I0123 11:00:26.204180 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-wpvl5" Jan 23 11:00:26 crc kubenswrapper[4957]: I0123 11:00:26.265538 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wpvl5" Jan 23 11:00:26 crc kubenswrapper[4957]: I0123 11:00:26.272011 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wpvl5" Jan 23 11:00:26 crc kubenswrapper[4957]: I0123 11:00:26.276296 4957 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-wpvl5" podStartSLOduration=7.276263538 
podStartE2EDuration="7.276263538s" podCreationTimestamp="2026-01-23 11:00:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-23 11:00:26.271984507 +0000 UTC m=+535.809237204" watchObservedRunningTime="2026-01-23 11:00:26.276263538 +0000 UTC m=+535.813516225" Jan 23 11:00:33 crc kubenswrapper[4957]: I0123 11:00:33.770123 4957 scope.go:117] "RemoveContainer" containerID="8e5c8d9deccce35da00836243a4c325855cbf167f74a231795eea7aff84803a4" Jan 23 11:00:33 crc kubenswrapper[4957]: E0123 11:00:33.772087 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-tlz2g_openshift-multus(233fdd78-4010-4fe8-9068-ee47d8ff25d1)\"" pod="openshift-multus/multus-tlz2g" podUID="233fdd78-4010-4fe8-9068-ee47d8ff25d1" Jan 23 11:00:44 crc kubenswrapper[4957]: I0123 11:00:44.770541 4957 scope.go:117] "RemoveContainer" containerID="8e5c8d9deccce35da00836243a4c325855cbf167f74a231795eea7aff84803a4" Jan 23 11:00:45 crc kubenswrapper[4957]: I0123 11:00:45.329222 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-tlz2g_233fdd78-4010-4fe8-9068-ee47d8ff25d1/kube-multus/2.log" Jan 23 11:00:45 crc kubenswrapper[4957]: I0123 11:00:45.329585 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-tlz2g" event={"ID":"233fdd78-4010-4fe8-9068-ee47d8ff25d1","Type":"ContainerStarted","Data":"664dcbf866cfa7befa53b8dda39ca88d4b0fc18af28598e44ac8f69b62c9b3ab"} Jan 23 11:00:49 crc kubenswrapper[4957]: I0123 11:00:49.710432 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wpvl5" Jan 23 11:01:15 crc kubenswrapper[4957]: I0123 11:01:15.717073 4957 patch_prober.go:28] interesting pod/machine-config-daemon-w2xjv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 23 11:01:15 crc kubenswrapper[4957]: I0123 11:01:15.717901 4957 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w2xjv" podUID="224e3211-1f68-4673-8975-7e71b1e513d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 23 11:01:21 crc kubenswrapper[4957]: I0123 11:01:21.480175 4957 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mm4q6"] Jan 23 11:01:21 crc kubenswrapper[4957]: I0123 11:01:21.480896 4957 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-mm4q6" podUID="0aee337e-503a-46b7-8c0b-4e69a7618a9b" containerName="registry-server" containerID="cri-o://472c61ac92d78b14e720ab5a457372e7326356502517d57f2ff997d0204ed20b" gracePeriod=30 Jan 23 11:01:21 crc kubenswrapper[4957]: I0123 11:01:21.846677 4957 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mm4q6" Jan 23 11:01:22 crc kubenswrapper[4957]: I0123 11:01:22.017559 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0aee337e-503a-46b7-8c0b-4e69a7618a9b-utilities\") pod \"0aee337e-503a-46b7-8c0b-4e69a7618a9b\" (UID: \"0aee337e-503a-46b7-8c0b-4e69a7618a9b\") " Jan 23 11:01:22 crc kubenswrapper[4957]: I0123 11:01:22.017626 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2rc7x\" (UniqueName: \"kubernetes.io/projected/0aee337e-503a-46b7-8c0b-4e69a7618a9b-kube-api-access-2rc7x\") pod \"0aee337e-503a-46b7-8c0b-4e69a7618a9b\" (UID: \"0aee337e-503a-46b7-8c0b-4e69a7618a9b\") " Jan 23 11:01:22 crc kubenswrapper[4957]: I0123 11:01:22.017654 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0aee337e-503a-46b7-8c0b-4e69a7618a9b-catalog-content\") pod \"0aee337e-503a-46b7-8c0b-4e69a7618a9b\" (UID: \"0aee337e-503a-46b7-8c0b-4e69a7618a9b\") " Jan 23 11:01:22 crc kubenswrapper[4957]: I0123 11:01:22.019590 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0aee337e-503a-46b7-8c0b-4e69a7618a9b-utilities" (OuterVolumeSpecName: "utilities") pod "0aee337e-503a-46b7-8c0b-4e69a7618a9b" (UID: "0aee337e-503a-46b7-8c0b-4e69a7618a9b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 23 11:01:22 crc kubenswrapper[4957]: I0123 11:01:22.027573 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0aee337e-503a-46b7-8c0b-4e69a7618a9b-kube-api-access-2rc7x" (OuterVolumeSpecName: "kube-api-access-2rc7x") pod "0aee337e-503a-46b7-8c0b-4e69a7618a9b" (UID: "0aee337e-503a-46b7-8c0b-4e69a7618a9b"). InnerVolumeSpecName "kube-api-access-2rc7x". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 23 11:01:22 crc kubenswrapper[4957]: I0123 11:01:22.059198 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0aee337e-503a-46b7-8c0b-4e69a7618a9b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0aee337e-503a-46b7-8c0b-4e69a7618a9b" (UID: "0aee337e-503a-46b7-8c0b-4e69a7618a9b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 23 11:01:22 crc kubenswrapper[4957]: I0123 11:01:22.119735 4957 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0aee337e-503a-46b7-8c0b-4e69a7618a9b-utilities\") on node \"crc\" DevicePath \"\"" Jan 23 11:01:22 crc kubenswrapper[4957]: I0123 11:01:22.119791 4957 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2rc7x\" (UniqueName: \"kubernetes.io/projected/0aee337e-503a-46b7-8c0b-4e69a7618a9b-kube-api-access-2rc7x\") on node \"crc\" DevicePath \"\"" Jan 23 11:01:22 crc kubenswrapper[4957]: I0123 11:01:22.119811 4957 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0aee337e-503a-46b7-8c0b-4e69a7618a9b-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 23 11:01:22 crc kubenswrapper[4957]: I0123 11:01:22.555451 4957 generic.go:334] "Generic (PLEG): container finished" podID="0aee337e-503a-46b7-8c0b-4e69a7618a9b" containerID="472c61ac92d78b14e720ab5a457372e7326356502517d57f2ff997d0204ed20b" exitCode=0 Jan 23 11:01:22 crc kubenswrapper[4957]: I0123 11:01:22.555499 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mm4q6" event={"ID":"0aee337e-503a-46b7-8c0b-4e69a7618a9b","Type":"ContainerDied","Data":"472c61ac92d78b14e720ab5a457372e7326356502517d57f2ff997d0204ed20b"} Jan 23 11:01:22 crc kubenswrapper[4957]: I0123 11:01:22.555527 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mm4q6" event={"ID":"0aee337e-503a-46b7-8c0b-4e69a7618a9b","Type":"ContainerDied","Data":"dd19865ae7c9b539b0d686bed3c5f5d0d2b4571c1f1c28001cd9d6754b2d1345"} Jan 23 11:01:22 crc kubenswrapper[4957]: I0123 11:01:22.555546 4957 scope.go:117] "RemoveContainer" containerID="472c61ac92d78b14e720ab5a457372e7326356502517d57f2ff997d0204ed20b" Jan 23 11:01:22 crc kubenswrapper[4957]: I0123 11:01:22.555504 4957 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mm4q6" Jan 23 11:01:22 crc kubenswrapper[4957]: I0123 11:01:22.571257 4957 scope.go:117] "RemoveContainer" containerID="41b1611656ce6cad9daec18f7445e8c608a4bab0ab3fc1faed6e8b3e5f1efa5b" Jan 23 11:01:22 crc kubenswrapper[4957]: I0123 11:01:22.582897 4957 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mm4q6"] Jan 23 11:01:22 crc kubenswrapper[4957]: I0123 11:01:22.585672 4957 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-mm4q6"] Jan 23 11:01:22 crc kubenswrapper[4957]: I0123 11:01:22.586214 4957 scope.go:117] "RemoveContainer" containerID="a0d3b1e74a75f86e28a5b83e352027bdd754fa3f854bf0550477ddb43dc65801" Jan 23 11:01:22 crc kubenswrapper[4957]: I0123 11:01:22.611050 4957 scope.go:117] "RemoveContainer" containerID="472c61ac92d78b14e720ab5a457372e7326356502517d57f2ff997d0204ed20b" Jan 23 11:01:22 crc kubenswrapper[4957]: E0123 11:01:22.611714 4957 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"472c61ac92d78b14e720ab5a457372e7326356502517d57f2ff997d0204ed20b\": container with ID starting with 472c61ac92d78b14e720ab5a457372e7326356502517d57f2ff997d0204ed20b not found: ID does not exist" containerID="472c61ac92d78b14e720ab5a457372e7326356502517d57f2ff997d0204ed20b" Jan 23 11:01:22 crc kubenswrapper[4957]: I0123 11:01:22.611784 4957 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"472c61ac92d78b14e720ab5a457372e7326356502517d57f2ff997d0204ed20b"} err="failed to get container status \"472c61ac92d78b14e720ab5a457372e7326356502517d57f2ff997d0204ed20b\": rpc error: code = NotFound desc = could not find container \"472c61ac92d78b14e720ab5a457372e7326356502517d57f2ff997d0204ed20b\": container with ID starting with 472c61ac92d78b14e720ab5a457372e7326356502517d57f2ff997d0204ed20b not found: ID does not exist" Jan 23 11:01:22 crc kubenswrapper[4957]: I0123 11:01:22.611833 4957 scope.go:117] "RemoveContainer" containerID="41b1611656ce6cad9daec18f7445e8c608a4bab0ab3fc1faed6e8b3e5f1efa5b" Jan 23 11:01:22 crc kubenswrapper[4957]: E0123 11:01:22.612238 4957 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41b1611656ce6cad9daec18f7445e8c608a4bab0ab3fc1faed6e8b3e5f1efa5b\": container with ID starting with 41b1611656ce6cad9daec18f7445e8c608a4bab0ab3fc1faed6e8b3e5f1efa5b not found: ID does not exist" containerID="41b1611656ce6cad9daec18f7445e8c608a4bab0ab3fc1faed6e8b3e5f1efa5b" Jan 23 11:01:22 crc kubenswrapper[4957]: I0123 11:01:22.612271 4957 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41b1611656ce6cad9daec18f7445e8c608a4bab0ab3fc1faed6e8b3e5f1efa5b"} err="failed to get container status \"41b1611656ce6cad9daec18f7445e8c608a4bab0ab3fc1faed6e8b3e5f1efa5b\": rpc error: code = NotFound desc = could not find container \"41b1611656ce6cad9daec18f7445e8c608a4bab0ab3fc1faed6e8b3e5f1efa5b\": container with ID starting with 41b1611656ce6cad9daec18f7445e8c608a4bab0ab3fc1faed6e8b3e5f1efa5b not found: ID does not exist" Jan 23 11:01:22 crc kubenswrapper[4957]: I0123 11:01:22.612415 4957 scope.go:117] "RemoveContainer" containerID="a0d3b1e74a75f86e28a5b83e352027bdd754fa3f854bf0550477ddb43dc65801" Jan 23 11:01:22 crc kubenswrapper[4957]: E0123 11:01:22.613042 4957 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"a0d3b1e74a75f86e28a5b83e352027bdd754fa3f854bf0550477ddb43dc65801\": container with ID starting with a0d3b1e74a75f86e28a5b83e352027bdd754fa3f854bf0550477ddb43dc65801 not found: ID does not exist" containerID="a0d3b1e74a75f86e28a5b83e352027bdd754fa3f854bf0550477ddb43dc65801" Jan 23 11:01:22 crc kubenswrapper[4957]: I0123 11:01:22.613092 4957 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0d3b1e74a75f86e28a5b83e352027bdd754fa3f854bf0550477ddb43dc65801"} err="failed to get container status \"a0d3b1e74a75f86e28a5b83e352027bdd754fa3f854bf0550477ddb43dc65801\": rpc error: code = NotFound desc = could not find container \"a0d3b1e74a75f86e28a5b83e352027bdd754fa3f854bf0550477ddb43dc65801\": container with ID starting with a0d3b1e74a75f86e28a5b83e352027bdd754fa3f854bf0550477ddb43dc65801 not found: ID does not exist" Jan 23 11:01:22 crc kubenswrapper[4957]: I0123 11:01:22.779931 4957 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0aee337e-503a-46b7-8c0b-4e69a7618a9b" path="/var/lib/kubelet/pods/0aee337e-503a-46b7-8c0b-4e69a7618a9b/volumes" Jan 23 11:01:25 crc kubenswrapper[4957]: I0123 11:01:25.208294 4957 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08cz5f7"] Jan 23 11:01:25 crc kubenswrapper[4957]: E0123 11:01:25.208529 4957 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0aee337e-503a-46b7-8c0b-4e69a7618a9b" containerName="extract-content" Jan 23 11:01:25 crc kubenswrapper[4957]: I0123 11:01:25.208545 4957 state_mem.go:107] "Deleted CPUSet assignment" podUID="0aee337e-503a-46b7-8c0b-4e69a7618a9b" containerName="extract-content" Jan 23 11:01:25 crc kubenswrapper[4957]: E0123 11:01:25.208556 4957 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0aee337e-503a-46b7-8c0b-4e69a7618a9b" containerName="extract-utilities" Jan 23 11:01:25 crc kubenswrapper[4957]: I0123 11:01:25.208563 4957 state_mem.go:107] "Deleted CPUSet assignment" podUID="0aee337e-503a-46b7-8c0b-4e69a7618a9b" containerName="extract-utilities" Jan 23 11:01:25 crc kubenswrapper[4957]: E0123 11:01:25.208579 4957 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0aee337e-503a-46b7-8c0b-4e69a7618a9b" containerName="registry-server" Jan 23 11:01:25 crc kubenswrapper[4957]: I0123 11:01:25.208586 4957 state_mem.go:107] "Deleted CPUSet assignment" podUID="0aee337e-503a-46b7-8c0b-4e69a7618a9b" containerName="registry-server" Jan 23 11:01:25 crc kubenswrapper[4957]: I0123 11:01:25.208711 4957 memory_manager.go:354] "RemoveStaleState removing state" podUID="0aee337e-503a-46b7-8c0b-4e69a7618a9b" containerName="registry-server" Jan 23 11:01:25 crc kubenswrapper[4957]: I0123 11:01:25.209583 4957 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08cz5f7" Jan 23 11:01:25 crc kubenswrapper[4957]: I0123 11:01:25.212693 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Jan 23 11:01:25 crc kubenswrapper[4957]: I0123 11:01:25.226258 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08cz5f7"] Jan 23 11:01:25 crc kubenswrapper[4957]: I0123 11:01:25.358361 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/684bbccf-3dbd-475e-b034-dfb861ac18a0-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08cz5f7\" (UID: \"684bbccf-3dbd-475e-b034-dfb861ac18a0\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08cz5f7" Jan 23 11:01:25 crc kubenswrapper[4957]: I0123 11:01:25.358640 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/684bbccf-3dbd-475e-b034-dfb861ac18a0-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08cz5f7\" (UID: \"684bbccf-3dbd-475e-b034-dfb861ac18a0\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08cz5f7" Jan 23 11:01:25 crc kubenswrapper[4957]: I0123 11:01:25.358792 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blnf9\" (UniqueName: \"kubernetes.io/projected/684bbccf-3dbd-475e-b034-dfb861ac18a0-kube-api-access-blnf9\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08cz5f7\" (UID: \"684bbccf-3dbd-475e-b034-dfb861ac18a0\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08cz5f7" Jan 23 11:01:25 crc kubenswrapper[4957]: I0123 11:01:25.459582 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/684bbccf-3dbd-475e-b034-dfb861ac18a0-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08cz5f7\" (UID: \"684bbccf-3dbd-475e-b034-dfb861ac18a0\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08cz5f7" Jan 23 11:01:25 crc kubenswrapper[4957]: I0123 11:01:25.459642 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/684bbccf-3dbd-475e-b034-dfb861ac18a0-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08cz5f7\" (UID: \"684bbccf-3dbd-475e-b034-dfb861ac18a0\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08cz5f7" Jan 23 11:01:25 crc kubenswrapper[4957]: I0123 11:01:25.459700 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blnf9\" (UniqueName: \"kubernetes.io/projected/684bbccf-3dbd-475e-b034-dfb861ac18a0-kube-api-access-blnf9\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08cz5f7\" (UID: \"684bbccf-3dbd-475e-b034-dfb861ac18a0\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08cz5f7" Jan 23 11:01:25 crc kubenswrapper[4957]: I0123 11:01:25.460092 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/684bbccf-3dbd-475e-b034-dfb861ac18a0-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08cz5f7\" (UID: \"684bbccf-3dbd-475e-b034-dfb861ac18a0\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08cz5f7" Jan 23 11:01:25 crc kubenswrapper[4957]: I0123 11:01:25.460154 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/684bbccf-3dbd-475e-b034-dfb861ac18a0-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08cz5f7\" (UID: \"684bbccf-3dbd-475e-b034-dfb861ac18a0\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08cz5f7" Jan 23 11:01:25 crc kubenswrapper[4957]: I0123 11:01:25.476971 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-blnf9\" (UniqueName: \"kubernetes.io/projected/684bbccf-3dbd-475e-b034-dfb861ac18a0-kube-api-access-blnf9\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08cz5f7\" (UID: \"684bbccf-3dbd-475e-b034-dfb861ac18a0\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08cz5f7" Jan 23 11:01:25 crc kubenswrapper[4957]: I0123 11:01:25.527298 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08cz5f7" Jan 23 11:01:25 crc kubenswrapper[4957]: I0123 11:01:25.721100 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08cz5f7"] Jan 23 11:01:26 crc kubenswrapper[4957]: I0123 11:01:26.579958 4957 generic.go:334] "Generic (PLEG): container finished" podID="684bbccf-3dbd-475e-b034-dfb861ac18a0" containerID="10057c53bc20f407fdce0bcc05725adc491417d89c43ae35c7007aa54de54c83" exitCode=0 Jan 23 11:01:26 crc kubenswrapper[4957]: I0123 11:01:26.580012 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08cz5f7" event={"ID":"684bbccf-3dbd-475e-b034-dfb861ac18a0","Type":"ContainerDied","Data":"10057c53bc20f407fdce0bcc05725adc491417d89c43ae35c7007aa54de54c83"} Jan 23 11:01:26 crc kubenswrapper[4957]: I0123 11:01:26.580253 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08cz5f7" event={"ID":"684bbccf-3dbd-475e-b034-dfb861ac18a0","Type":"ContainerStarted","Data":"3e6fbccdf938389aa8adf0721e4cf0b4f66a977808fa93222e78d2a11b1f7816"} Jan 23 11:01:26 crc kubenswrapper[4957]: I0123 11:01:26.581622 4957 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 23 11:01:28 crc kubenswrapper[4957]: I0123 11:01:28.593353 4957 generic.go:334] "Generic (PLEG): container finished" podID="684bbccf-3dbd-475e-b034-dfb861ac18a0" containerID="092e1407ef42f46fac9acc8f263bb07c2a31eea46bf6d80595d6d09f034c6766" exitCode=0 Jan 23 11:01:28 crc kubenswrapper[4957]: I0123 11:01:28.593533 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08cz5f7" event={"ID":"684bbccf-3dbd-475e-b034-dfb861ac18a0","Type":"ContainerDied","Data":"092e1407ef42f46fac9acc8f263bb07c2a31eea46bf6d80595d6d09f034c6766"} Jan 23 11:01:29 crc kubenswrapper[4957]: I0123 11:01:29.605371 4957 generic.go:334] "Generic (PLEG): container finished" 
podID="684bbccf-3dbd-475e-b034-dfb861ac18a0" containerID="4cad50e1b9116439134a769b368bc6cc072d02cea4eae0a04244c17e285119b8" exitCode=0 Jan 23 11:01:29 crc kubenswrapper[4957]: I0123 11:01:29.605503 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08cz5f7" event={"ID":"684bbccf-3dbd-475e-b034-dfb861ac18a0","Type":"ContainerDied","Data":"4cad50e1b9116439134a769b368bc6cc072d02cea4eae0a04244c17e285119b8"} Jan 23 11:01:29 crc kubenswrapper[4957]: I0123 11:01:29.800853 4957 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a7l6jd"] Jan 23 11:01:29 crc kubenswrapper[4957]: I0123 11:01:29.803385 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a7l6jd" Jan 23 11:01:29 crc kubenswrapper[4957]: I0123 11:01:29.820950 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/05351f1f-ef4e-4c7a-a093-3cec8b6f3f56-bundle\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a7l6jd\" (UID: \"05351f1f-ef4e-4c7a-a093-3cec8b6f3f56\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a7l6jd" Jan 23 11:01:29 crc kubenswrapper[4957]: I0123 11:01:29.821145 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcgpz\" (UniqueName: \"kubernetes.io/projected/05351f1f-ef4e-4c7a-a093-3cec8b6f3f56-kube-api-access-hcgpz\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a7l6jd\" (UID: \"05351f1f-ef4e-4c7a-a093-3cec8b6f3f56\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a7l6jd" Jan 23 11:01:29 crc kubenswrapper[4957]: I0123 11:01:29.821330 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/05351f1f-ef4e-4c7a-a093-3cec8b6f3f56-util\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a7l6jd\" (UID: \"05351f1f-ef4e-4c7a-a093-3cec8b6f3f56\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a7l6jd" Jan 23 11:01:29 crc kubenswrapper[4957]: I0123 11:01:29.830247 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a7l6jd"] Jan 23 11:01:29 crc kubenswrapper[4957]: I0123 11:01:29.922382 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hcgpz\" (UniqueName: \"kubernetes.io/projected/05351f1f-ef4e-4c7a-a093-3cec8b6f3f56-kube-api-access-hcgpz\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a7l6jd\" (UID: \"05351f1f-ef4e-4c7a-a093-3cec8b6f3f56\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a7l6jd" Jan 23 11:01:29 crc kubenswrapper[4957]: I0123 11:01:29.922444 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/05351f1f-ef4e-4c7a-a093-3cec8b6f3f56-util\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a7l6jd\" (UID: \"05351f1f-ef4e-4c7a-a093-3cec8b6f3f56\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a7l6jd" Jan 23 11:01:29 crc 
kubenswrapper[4957]: I0123 11:01:29.922499 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/05351f1f-ef4e-4c7a-a093-3cec8b6f3f56-bundle\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a7l6jd\" (UID: \"05351f1f-ef4e-4c7a-a093-3cec8b6f3f56\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a7l6jd" Jan 23 11:01:29 crc kubenswrapper[4957]: I0123 11:01:29.922965 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/05351f1f-ef4e-4c7a-a093-3cec8b6f3f56-bundle\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a7l6jd\" (UID: \"05351f1f-ef4e-4c7a-a093-3cec8b6f3f56\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a7l6jd" Jan 23 11:01:29 crc kubenswrapper[4957]: I0123 11:01:29.922990 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/05351f1f-ef4e-4c7a-a093-3cec8b6f3f56-util\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a7l6jd\" (UID: \"05351f1f-ef4e-4c7a-a093-3cec8b6f3f56\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a7l6jd" Jan 23 11:01:29 crc kubenswrapper[4957]: I0123 11:01:29.946892 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcgpz\" (UniqueName: \"kubernetes.io/projected/05351f1f-ef4e-4c7a-a093-3cec8b6f3f56-kube-api-access-hcgpz\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a7l6jd\" (UID: \"05351f1f-ef4e-4c7a-a093-3cec8b6f3f56\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a7l6jd" Jan 23 11:01:30 crc kubenswrapper[4957]: I0123 11:01:30.129859 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a7l6jd" Jan 23 11:01:30 crc kubenswrapper[4957]: I0123 11:01:30.371692 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a7l6jd"] Jan 23 11:01:30 crc kubenswrapper[4957]: W0123 11:01:30.372161 4957 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod05351f1f_ef4e_4c7a_a093_3cec8b6f3f56.slice/crio-100981d69e0c39a09682501aabdc74c92821f74c9d243abe96ccd0ff92d2103d WatchSource:0}: Error finding container 100981d69e0c39a09682501aabdc74c92821f74c9d243abe96ccd0ff92d2103d: Status 404 returned error can't find the container with id 100981d69e0c39a09682501aabdc74c92821f74c9d243abe96ccd0ff92d2103d Jan 23 11:01:30 crc kubenswrapper[4957]: I0123 11:01:30.613576 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a7l6jd" event={"ID":"05351f1f-ef4e-4c7a-a093-3cec8b6f3f56","Type":"ContainerStarted","Data":"100981d69e0c39a09682501aabdc74c92821f74c9d243abe96ccd0ff92d2103d"} Jan 23 11:01:30 crc kubenswrapper[4957]: I0123 11:01:30.806545 4957 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08cz5f7" Jan 23 11:01:30 crc kubenswrapper[4957]: I0123 11:01:30.934710 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-blnf9\" (UniqueName: \"kubernetes.io/projected/684bbccf-3dbd-475e-b034-dfb861ac18a0-kube-api-access-blnf9\") pod \"684bbccf-3dbd-475e-b034-dfb861ac18a0\" (UID: \"684bbccf-3dbd-475e-b034-dfb861ac18a0\") " Jan 23 11:01:30 crc kubenswrapper[4957]: I0123 11:01:30.934937 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/684bbccf-3dbd-475e-b034-dfb861ac18a0-bundle\") pod \"684bbccf-3dbd-475e-b034-dfb861ac18a0\" (UID: \"684bbccf-3dbd-475e-b034-dfb861ac18a0\") " Jan 23 11:01:30 crc kubenswrapper[4957]: I0123 11:01:30.934986 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/684bbccf-3dbd-475e-b034-dfb861ac18a0-util\") pod \"684bbccf-3dbd-475e-b034-dfb861ac18a0\" (UID: \"684bbccf-3dbd-475e-b034-dfb861ac18a0\") " Jan 23 11:01:30 crc kubenswrapper[4957]: I0123 11:01:30.936989 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/684bbccf-3dbd-475e-b034-dfb861ac18a0-bundle" (OuterVolumeSpecName: "bundle") pod "684bbccf-3dbd-475e-b034-dfb861ac18a0" (UID: "684bbccf-3dbd-475e-b034-dfb861ac18a0"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 23 11:01:30 crc kubenswrapper[4957]: I0123 11:01:30.940070 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/684bbccf-3dbd-475e-b034-dfb861ac18a0-kube-api-access-blnf9" (OuterVolumeSpecName: "kube-api-access-blnf9") pod "684bbccf-3dbd-475e-b034-dfb861ac18a0" (UID: "684bbccf-3dbd-475e-b034-dfb861ac18a0"). InnerVolumeSpecName "kube-api-access-blnf9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 23 11:01:30 crc kubenswrapper[4957]: I0123 11:01:30.952197 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/684bbccf-3dbd-475e-b034-dfb861ac18a0-util" (OuterVolumeSpecName: "util") pod "684bbccf-3dbd-475e-b034-dfb861ac18a0" (UID: "684bbccf-3dbd-475e-b034-dfb861ac18a0"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 23 11:01:31 crc kubenswrapper[4957]: I0123 11:01:31.036373 4957 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-blnf9\" (UniqueName: \"kubernetes.io/projected/684bbccf-3dbd-475e-b034-dfb861ac18a0-kube-api-access-blnf9\") on node \"crc\" DevicePath \"\"" Jan 23 11:01:31 crc kubenswrapper[4957]: I0123 11:01:31.036418 4957 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/684bbccf-3dbd-475e-b034-dfb861ac18a0-bundle\") on node \"crc\" DevicePath \"\"" Jan 23 11:01:31 crc kubenswrapper[4957]: I0123 11:01:31.036435 4957 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/684bbccf-3dbd-475e-b034-dfb861ac18a0-util\") on node \"crc\" DevicePath \"\"" Jan 23 11:01:31 crc kubenswrapper[4957]: I0123 11:01:31.179153 4957 scope.go:117] "RemoveContainer" containerID="a932ded0eec0316dbcf394a6add196e5056c48b5d2a44b51bb89191816ed38b0" Jan 23 11:01:31 crc kubenswrapper[4957]: I0123 11:01:31.621219 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08cz5f7" event={"ID":"684bbccf-3dbd-475e-b034-dfb861ac18a0","Type":"ContainerDied","Data":"3e6fbccdf938389aa8adf0721e4cf0b4f66a977808fa93222e78d2a11b1f7816"} Jan 23 11:01:31 crc kubenswrapper[4957]: I0123 11:01:31.621258 4957 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3e6fbccdf938389aa8adf0721e4cf0b4f66a977808fa93222e78d2a11b1f7816" Jan 23 11:01:31 crc kubenswrapper[4957]: I0123 11:01:31.621258 4957 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08cz5f7" Jan 23 11:01:31 crc kubenswrapper[4957]: I0123 11:01:31.623343 4957 generic.go:334] "Generic (PLEG): container finished" podID="05351f1f-ef4e-4c7a-a093-3cec8b6f3f56" containerID="e8058b5cef581f6d4455bac18815254f0a779432955cce6011d2ddf95afa0eee" exitCode=0 Jan 23 11:01:31 crc kubenswrapper[4957]: I0123 11:01:31.623375 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a7l6jd" event={"ID":"05351f1f-ef4e-4c7a-a093-3cec8b6f3f56","Type":"ContainerDied","Data":"e8058b5cef581f6d4455bac18815254f0a779432955cce6011d2ddf95afa0eee"} Jan 23 11:01:31 crc kubenswrapper[4957]: I0123 11:01:31.989254 4957 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fvhl79"] Jan 23 11:01:31 crc kubenswrapper[4957]: E0123 11:01:31.989460 4957 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="684bbccf-3dbd-475e-b034-dfb861ac18a0" containerName="extract" Jan 23 11:01:31 crc kubenswrapper[4957]: I0123 11:01:31.989471 4957 state_mem.go:107] "Deleted CPUSet assignment" podUID="684bbccf-3dbd-475e-b034-dfb861ac18a0" containerName="extract" Jan 23 11:01:31 crc kubenswrapper[4957]: E0123 11:01:31.989480 4957 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="684bbccf-3dbd-475e-b034-dfb861ac18a0" containerName="util" Jan 23 11:01:31 crc kubenswrapper[4957]: I0123 11:01:31.989486 4957 state_mem.go:107] "Deleted CPUSet assignment" podUID="684bbccf-3dbd-475e-b034-dfb861ac18a0" containerName="util" Jan 23 11:01:31 crc kubenswrapper[4957]: E0123 11:01:31.989496 4957 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="684bbccf-3dbd-475e-b034-dfb861ac18a0" containerName="pull" Jan 23 11:01:31 crc kubenswrapper[4957]: I0123 11:01:31.989503 4957 state_mem.go:107] "Deleted CPUSet assignment" podUID="684bbccf-3dbd-475e-b034-dfb861ac18a0" containerName="pull" Jan 23 11:01:31 crc kubenswrapper[4957]: I0123 11:01:31.989591 4957 memory_manager.go:354] "RemoveStaleState removing state" podUID="684bbccf-3dbd-475e-b034-dfb861ac18a0" containerName="extract" Jan 23 11:01:31 crc kubenswrapper[4957]: I0123 11:01:31.990235 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fvhl79" Jan 23 11:01:32 crc kubenswrapper[4957]: I0123 11:01:32.001386 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fvhl79"] Jan 23 11:01:32 crc kubenswrapper[4957]: I0123 11:01:32.050200 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/84a261d5-d82c-4cbe-9079-64c9380bcd44-bundle\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fvhl79\" (UID: \"84a261d5-d82c-4cbe-9079-64c9380bcd44\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fvhl79" Jan 23 11:01:32 crc kubenswrapper[4957]: I0123 11:01:32.050272 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/84a261d5-d82c-4cbe-9079-64c9380bcd44-util\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fvhl79\" (UID: \"84a261d5-d82c-4cbe-9079-64c9380bcd44\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fvhl79" Jan 23 11:01:32 crc kubenswrapper[4957]: I0123 11:01:32.050372 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjqdw\" (UniqueName: \"kubernetes.io/projected/84a261d5-d82c-4cbe-9079-64c9380bcd44-kube-api-access-sjqdw\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fvhl79\" (UID: \"84a261d5-d82c-4cbe-9079-64c9380bcd44\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fvhl79" Jan 23 11:01:32 crc kubenswrapper[4957]: I0123 11:01:32.150936 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjqdw\" (UniqueName: \"kubernetes.io/projected/84a261d5-d82c-4cbe-9079-64c9380bcd44-kube-api-access-sjqdw\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fvhl79\" (UID: \"84a261d5-d82c-4cbe-9079-64c9380bcd44\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fvhl79" Jan 23 11:01:32 crc kubenswrapper[4957]: I0123 11:01:32.150979 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/84a261d5-d82c-4cbe-9079-64c9380bcd44-bundle\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fvhl79\" (UID: \"84a261d5-d82c-4cbe-9079-64c9380bcd44\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fvhl79" Jan 23 11:01:32 crc kubenswrapper[4957]: I0123 11:01:32.151026 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/84a261d5-d82c-4cbe-9079-64c9380bcd44-util\") pod 
\"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fvhl79\" (UID: \"84a261d5-d82c-4cbe-9079-64c9380bcd44\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fvhl79" Jan 23 11:01:32 crc kubenswrapper[4957]: I0123 11:01:32.151865 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/84a261d5-d82c-4cbe-9079-64c9380bcd44-util\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fvhl79\" (UID: \"84a261d5-d82c-4cbe-9079-64c9380bcd44\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fvhl79" Jan 23 11:01:32 crc kubenswrapper[4957]: I0123 11:01:32.152012 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/84a261d5-d82c-4cbe-9079-64c9380bcd44-bundle\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fvhl79\" (UID: \"84a261d5-d82c-4cbe-9079-64c9380bcd44\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fvhl79" Jan 23 11:01:32 crc kubenswrapper[4957]: I0123 11:01:32.168259 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjqdw\" (UniqueName: \"kubernetes.io/projected/84a261d5-d82c-4cbe-9079-64c9380bcd44-kube-api-access-sjqdw\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fvhl79\" (UID: \"84a261d5-d82c-4cbe-9079-64c9380bcd44\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fvhl79" Jan 23 11:01:32 crc kubenswrapper[4957]: I0123 11:01:32.308031 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fvhl79" Jan 23 11:01:32 crc kubenswrapper[4957]: I0123 11:01:32.518514 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fvhl79"] Jan 23 11:01:32 crc kubenswrapper[4957]: I0123 11:01:32.630180 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fvhl79" event={"ID":"84a261d5-d82c-4cbe-9079-64c9380bcd44","Type":"ContainerStarted","Data":"40df747b4fb9a09e680c930e52ba6e5da5e4d344491ac01d3e35fd02908371aa"} Jan 23 11:01:32 crc kubenswrapper[4957]: I0123 11:01:32.807676 4957 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5emt7dq"] Jan 23 11:01:32 crc kubenswrapper[4957]: I0123 11:01:32.809023 4957 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5emt7dq" Jan 23 11:01:32 crc kubenswrapper[4957]: I0123 11:01:32.821836 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5emt7dq"] Jan 23 11:01:32 crc kubenswrapper[4957]: I0123 11:01:32.965090 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bfa105be-eb9e-4e68-8aa1-5d91e002ca3c-util\") pod \"8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5emt7dq\" (UID: \"bfa105be-eb9e-4e68-8aa1-5d91e002ca3c\") " pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5emt7dq" Jan 23 11:01:32 crc kubenswrapper[4957]: I0123 11:01:32.965607 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bfa105be-eb9e-4e68-8aa1-5d91e002ca3c-bundle\") pod \"8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5emt7dq\" (UID: \"bfa105be-eb9e-4e68-8aa1-5d91e002ca3c\") " pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5emt7dq" Jan 23 11:01:32 crc kubenswrapper[4957]: I0123 11:01:32.965705 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdlm4\" (UniqueName: \"kubernetes.io/projected/bfa105be-eb9e-4e68-8aa1-5d91e002ca3c-kube-api-access-qdlm4\") pod \"8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5emt7dq\" (UID: \"bfa105be-eb9e-4e68-8aa1-5d91e002ca3c\") " pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5emt7dq" Jan 23 11:01:33 crc kubenswrapper[4957]: I0123 11:01:33.066648 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bfa105be-eb9e-4e68-8aa1-5d91e002ca3c-util\") pod \"8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5emt7dq\" (UID: \"bfa105be-eb9e-4e68-8aa1-5d91e002ca3c\") " pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5emt7dq" Jan 23 11:01:33 crc kubenswrapper[4957]: I0123 11:01:33.066745 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bfa105be-eb9e-4e68-8aa1-5d91e002ca3c-bundle\") pod \"8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5emt7dq\" (UID: \"bfa105be-eb9e-4e68-8aa1-5d91e002ca3c\") " pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5emt7dq" Jan 23 11:01:33 crc kubenswrapper[4957]: I0123 11:01:33.066778 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdlm4\" (UniqueName: \"kubernetes.io/projected/bfa105be-eb9e-4e68-8aa1-5d91e002ca3c-kube-api-access-qdlm4\") pod \"8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5emt7dq\" (UID: \"bfa105be-eb9e-4e68-8aa1-5d91e002ca3c\") " pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5emt7dq" Jan 23 11:01:33 crc kubenswrapper[4957]: I0123 11:01:33.067174 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bfa105be-eb9e-4e68-8aa1-5d91e002ca3c-util\") pod \"8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5emt7dq\" (UID: \"bfa105be-eb9e-4e68-8aa1-5d91e002ca3c\") " 
pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5emt7dq" Jan 23 11:01:33 crc kubenswrapper[4957]: I0123 11:01:33.067331 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bfa105be-eb9e-4e68-8aa1-5d91e002ca3c-bundle\") pod \"8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5emt7dq\" (UID: \"bfa105be-eb9e-4e68-8aa1-5d91e002ca3c\") " pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5emt7dq" Jan 23 11:01:33 crc kubenswrapper[4957]: I0123 11:01:33.086431 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdlm4\" (UniqueName: \"kubernetes.io/projected/bfa105be-eb9e-4e68-8aa1-5d91e002ca3c-kube-api-access-qdlm4\") pod \"8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5emt7dq\" (UID: \"bfa105be-eb9e-4e68-8aa1-5d91e002ca3c\") " pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5emt7dq" Jan 23 11:01:33 crc kubenswrapper[4957]: I0123 11:01:33.152118 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5emt7dq" Jan 23 11:01:33 crc kubenswrapper[4957]: I0123 11:01:33.351623 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5emt7dq"] Jan 23 11:01:33 crc kubenswrapper[4957]: W0123 11:01:33.374488 4957 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbfa105be_eb9e_4e68_8aa1_5d91e002ca3c.slice/crio-64bacc1af6e8a9039b00aa19930a855b81c6302fc3962969d9c98c6ac2987375 WatchSource:0}: Error finding container 64bacc1af6e8a9039b00aa19930a855b81c6302fc3962969d9c98c6ac2987375: Status 404 returned error can't find the container with id 64bacc1af6e8a9039b00aa19930a855b81c6302fc3962969d9c98c6ac2987375 Jan 23 11:01:33 crc kubenswrapper[4957]: I0123 11:01:33.637335 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5emt7dq" event={"ID":"bfa105be-eb9e-4e68-8aa1-5d91e002ca3c","Type":"ContainerStarted","Data":"64bacc1af6e8a9039b00aa19930a855b81c6302fc3962969d9c98c6ac2987375"} Jan 23 11:01:33 crc kubenswrapper[4957]: I0123 11:01:33.638928 4957 generic.go:334] "Generic (PLEG): container finished" podID="84a261d5-d82c-4cbe-9079-64c9380bcd44" containerID="b0964aa16ecbbc7987ff1435432c9e9381ca5cdfb9bc4661e830041b353516f7" exitCode=0 Jan 23 11:01:33 crc kubenswrapper[4957]: I0123 11:01:33.638971 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fvhl79" event={"ID":"84a261d5-d82c-4cbe-9079-64c9380bcd44","Type":"ContainerDied","Data":"b0964aa16ecbbc7987ff1435432c9e9381ca5cdfb9bc4661e830041b353516f7"} Jan 23 11:01:34 crc kubenswrapper[4957]: I0123 11:01:34.648760 4957 generic.go:334] "Generic (PLEG): container finished" podID="bfa105be-eb9e-4e68-8aa1-5d91e002ca3c" containerID="e09930e197d07967c7763bc93188033eb422625036209c5d297f4395cb591b23" exitCode=0 Jan 23 11:01:34 crc kubenswrapper[4957]: I0123 11:01:34.648828 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5emt7dq" 
event={"ID":"bfa105be-eb9e-4e68-8aa1-5d91e002ca3c","Type":"ContainerDied","Data":"e09930e197d07967c7763bc93188033eb422625036209c5d297f4395cb591b23"} Jan 23 11:01:36 crc kubenswrapper[4957]: I0123 11:01:36.666806 4957 generic.go:334] "Generic (PLEG): container finished" podID="05351f1f-ef4e-4c7a-a093-3cec8b6f3f56" containerID="33d88c1749b0403bba8959e8bf09f72e47589c07c4bcf1120332ef9d78f3b7c3" exitCode=0 Jan 23 11:01:36 crc kubenswrapper[4957]: I0123 11:01:36.666909 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a7l6jd" event={"ID":"05351f1f-ef4e-4c7a-a093-3cec8b6f3f56","Type":"ContainerDied","Data":"33d88c1749b0403bba8959e8bf09f72e47589c07c4bcf1120332ef9d78f3b7c3"} Jan 23 11:01:36 crc kubenswrapper[4957]: I0123 11:01:36.698255 4957 generic.go:334] "Generic (PLEG): container finished" podID="84a261d5-d82c-4cbe-9079-64c9380bcd44" containerID="af371e2317797218dbd552a09aaaab5f06611cb9f17327efa78b93bda9997d45" exitCode=0 Jan 23 11:01:36 crc kubenswrapper[4957]: I0123 11:01:36.698498 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fvhl79" event={"ID":"84a261d5-d82c-4cbe-9079-64c9380bcd44","Type":"ContainerDied","Data":"af371e2317797218dbd552a09aaaab5f06611cb9f17327efa78b93bda9997d45"} Jan 23 11:01:37 crc kubenswrapper[4957]: I0123 11:01:37.706616 4957 generic.go:334] "Generic (PLEG): container finished" podID="84a261d5-d82c-4cbe-9079-64c9380bcd44" containerID="dc664924aa8b93a6f011206ee0d03e581b03187624c39617b885917c7f0058d3" exitCode=0 Jan 23 11:01:37 crc kubenswrapper[4957]: I0123 11:01:37.706674 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fvhl79" event={"ID":"84a261d5-d82c-4cbe-9079-64c9380bcd44","Type":"ContainerDied","Data":"dc664924aa8b93a6f011206ee0d03e581b03187624c39617b885917c7f0058d3"} Jan 23 11:01:37 crc kubenswrapper[4957]: I0123 11:01:37.710117 4957 generic.go:334] "Generic (PLEG): container finished" podID="bfa105be-eb9e-4e68-8aa1-5d91e002ca3c" containerID="991676326d4f073ae2e6d18806d6d67241a148e9401fae12de6a7a31258c7795" exitCode=0 Jan 23 11:01:37 crc kubenswrapper[4957]: I0123 11:01:37.710170 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5emt7dq" event={"ID":"bfa105be-eb9e-4e68-8aa1-5d91e002ca3c","Type":"ContainerDied","Data":"991676326d4f073ae2e6d18806d6d67241a148e9401fae12de6a7a31258c7795"} Jan 23 11:01:37 crc kubenswrapper[4957]: I0123 11:01:37.713424 4957 generic.go:334] "Generic (PLEG): container finished" podID="05351f1f-ef4e-4c7a-a093-3cec8b6f3f56" containerID="6635df52e133eeb31bbb75abe75e98a8fbbb21b2e5a353340e134ba8f1c6fff9" exitCode=0 Jan 23 11:01:37 crc kubenswrapper[4957]: I0123 11:01:37.713465 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a7l6jd" event={"ID":"05351f1f-ef4e-4c7a-a093-3cec8b6f3f56","Type":"ContainerDied","Data":"6635df52e133eeb31bbb75abe75e98a8fbbb21b2e5a353340e134ba8f1c6fff9"} Jan 23 11:01:38 crc kubenswrapper[4957]: I0123 11:01:38.720189 4957 generic.go:334] "Generic (PLEG): container finished" podID="bfa105be-eb9e-4e68-8aa1-5d91e002ca3c" containerID="098976541c05f2de55dad591ff4bdb79c43692c01105e6d19a04752f0aff0620" exitCode=0 Jan 23 11:01:38 crc kubenswrapper[4957]: I0123 11:01:38.720256 4957 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5emt7dq" event={"ID":"bfa105be-eb9e-4e68-8aa1-5d91e002ca3c","Type":"ContainerDied","Data":"098976541c05f2de55dad591ff4bdb79c43692c01105e6d19a04752f0aff0620"} Jan 23 11:01:39 crc kubenswrapper[4957]: I0123 11:01:39.016148 4957 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a7l6jd" Jan 23 11:01:39 crc kubenswrapper[4957]: I0123 11:01:39.065995 4957 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fvhl79" Jan 23 11:01:39 crc kubenswrapper[4957]: I0123 11:01:39.146033 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/05351f1f-ef4e-4c7a-a093-3cec8b6f3f56-bundle\") pod \"05351f1f-ef4e-4c7a-a093-3cec8b6f3f56\" (UID: \"05351f1f-ef4e-4c7a-a093-3cec8b6f3f56\") " Jan 23 11:01:39 crc kubenswrapper[4957]: I0123 11:01:39.146116 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hcgpz\" (UniqueName: \"kubernetes.io/projected/05351f1f-ef4e-4c7a-a093-3cec8b6f3f56-kube-api-access-hcgpz\") pod \"05351f1f-ef4e-4c7a-a093-3cec8b6f3f56\" (UID: \"05351f1f-ef4e-4c7a-a093-3cec8b6f3f56\") " Jan 23 11:01:39 crc kubenswrapper[4957]: I0123 11:01:39.146183 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/05351f1f-ef4e-4c7a-a093-3cec8b6f3f56-util\") pod \"05351f1f-ef4e-4c7a-a093-3cec8b6f3f56\" (UID: \"05351f1f-ef4e-4c7a-a093-3cec8b6f3f56\") " Jan 23 11:01:39 crc kubenswrapper[4957]: I0123 11:01:39.147644 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05351f1f-ef4e-4c7a-a093-3cec8b6f3f56-bundle" (OuterVolumeSpecName: "bundle") pod "05351f1f-ef4e-4c7a-a093-3cec8b6f3f56" (UID: "05351f1f-ef4e-4c7a-a093-3cec8b6f3f56"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 23 11:01:39 crc kubenswrapper[4957]: I0123 11:01:39.162946 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05351f1f-ef4e-4c7a-a093-3cec8b6f3f56-util" (OuterVolumeSpecName: "util") pod "05351f1f-ef4e-4c7a-a093-3cec8b6f3f56" (UID: "05351f1f-ef4e-4c7a-a093-3cec8b6f3f56"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 23 11:01:39 crc kubenswrapper[4957]: I0123 11:01:39.169430 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05351f1f-ef4e-4c7a-a093-3cec8b6f3f56-kube-api-access-hcgpz" (OuterVolumeSpecName: "kube-api-access-hcgpz") pod "05351f1f-ef4e-4c7a-a093-3cec8b6f3f56" (UID: "05351f1f-ef4e-4c7a-a093-3cec8b6f3f56"). InnerVolumeSpecName "kube-api-access-hcgpz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 23 11:01:39 crc kubenswrapper[4957]: I0123 11:01:39.247190 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/84a261d5-d82c-4cbe-9079-64c9380bcd44-util\") pod \"84a261d5-d82c-4cbe-9079-64c9380bcd44\" (UID: \"84a261d5-d82c-4cbe-9079-64c9380bcd44\") " Jan 23 11:01:39 crc kubenswrapper[4957]: I0123 11:01:39.247320 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/84a261d5-d82c-4cbe-9079-64c9380bcd44-bundle\") pod \"84a261d5-d82c-4cbe-9079-64c9380bcd44\" (UID: \"84a261d5-d82c-4cbe-9079-64c9380bcd44\") " Jan 23 11:01:39 crc kubenswrapper[4957]: I0123 11:01:39.247349 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sjqdw\" (UniqueName: \"kubernetes.io/projected/84a261d5-d82c-4cbe-9079-64c9380bcd44-kube-api-access-sjqdw\") pod \"84a261d5-d82c-4cbe-9079-64c9380bcd44\" (UID: \"84a261d5-d82c-4cbe-9079-64c9380bcd44\") " Jan 23 11:01:39 crc kubenswrapper[4957]: I0123 11:01:39.247567 4957 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/05351f1f-ef4e-4c7a-a093-3cec8b6f3f56-bundle\") on node \"crc\" DevicePath \"\"" Jan 23 11:01:39 crc kubenswrapper[4957]: I0123 11:01:39.247589 4957 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hcgpz\" (UniqueName: \"kubernetes.io/projected/05351f1f-ef4e-4c7a-a093-3cec8b6f3f56-kube-api-access-hcgpz\") on node \"crc\" DevicePath \"\"" Jan 23 11:01:39 crc kubenswrapper[4957]: I0123 11:01:39.247612 4957 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/05351f1f-ef4e-4c7a-a093-3cec8b6f3f56-util\") on node \"crc\" DevicePath \"\"" Jan 23 11:01:39 crc kubenswrapper[4957]: I0123 11:01:39.248384 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84a261d5-d82c-4cbe-9079-64c9380bcd44-bundle" (OuterVolumeSpecName: "bundle") pod "84a261d5-d82c-4cbe-9079-64c9380bcd44" (UID: "84a261d5-d82c-4cbe-9079-64c9380bcd44"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 23 11:01:39 crc kubenswrapper[4957]: I0123 11:01:39.251368 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84a261d5-d82c-4cbe-9079-64c9380bcd44-kube-api-access-sjqdw" (OuterVolumeSpecName: "kube-api-access-sjqdw") pod "84a261d5-d82c-4cbe-9079-64c9380bcd44" (UID: "84a261d5-d82c-4cbe-9079-64c9380bcd44"). InnerVolumeSpecName "kube-api-access-sjqdw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 23 11:01:39 crc kubenswrapper[4957]: I0123 11:01:39.260403 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84a261d5-d82c-4cbe-9079-64c9380bcd44-util" (OuterVolumeSpecName: "util") pod "84a261d5-d82c-4cbe-9079-64c9380bcd44" (UID: "84a261d5-d82c-4cbe-9079-64c9380bcd44"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 23 11:01:39 crc kubenswrapper[4957]: I0123 11:01:39.348977 4957 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/84a261d5-d82c-4cbe-9079-64c9380bcd44-util\") on node \"crc\" DevicePath \"\"" Jan 23 11:01:39 crc kubenswrapper[4957]: I0123 11:01:39.349009 4957 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/84a261d5-d82c-4cbe-9079-64c9380bcd44-bundle\") on node \"crc\" DevicePath \"\"" Jan 23 11:01:39 crc kubenswrapper[4957]: I0123 11:01:39.349018 4957 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sjqdw\" (UniqueName: \"kubernetes.io/projected/84a261d5-d82c-4cbe-9079-64c9380bcd44-kube-api-access-sjqdw\") on node \"crc\" DevicePath \"\"" Jan 23 11:01:39 crc kubenswrapper[4957]: I0123 11:01:39.727945 4957 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a7l6jd" Jan 23 11:01:39 crc kubenswrapper[4957]: I0123 11:01:39.727908 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a7l6jd" event={"ID":"05351f1f-ef4e-4c7a-a093-3cec8b6f3f56","Type":"ContainerDied","Data":"100981d69e0c39a09682501aabdc74c92821f74c9d243abe96ccd0ff92d2103d"} Jan 23 11:01:39 crc kubenswrapper[4957]: I0123 11:01:39.728424 4957 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="100981d69e0c39a09682501aabdc74c92821f74c9d243abe96ccd0ff92d2103d" Jan 23 11:01:39 crc kubenswrapper[4957]: I0123 11:01:39.729657 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fvhl79" event={"ID":"84a261d5-d82c-4cbe-9079-64c9380bcd44","Type":"ContainerDied","Data":"40df747b4fb9a09e680c930e52ba6e5da5e4d344491ac01d3e35fd02908371aa"} Jan 23 11:01:39 crc kubenswrapper[4957]: I0123 11:01:39.729712 4957 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="40df747b4fb9a09e680c930e52ba6e5da5e4d344491ac01d3e35fd02908371aa" Jan 23 11:01:39 crc kubenswrapper[4957]: I0123 11:01:39.729832 4957 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fvhl79" Jan 23 11:01:39 crc kubenswrapper[4957]: I0123 11:01:39.940077 4957 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5emt7dq" Jan 23 11:01:40 crc kubenswrapper[4957]: I0123 11:01:40.060650 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bfa105be-eb9e-4e68-8aa1-5d91e002ca3c-bundle\") pod \"bfa105be-eb9e-4e68-8aa1-5d91e002ca3c\" (UID: \"bfa105be-eb9e-4e68-8aa1-5d91e002ca3c\") " Jan 23 11:01:40 crc kubenswrapper[4957]: I0123 11:01:40.060795 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bfa105be-eb9e-4e68-8aa1-5d91e002ca3c-util\") pod \"bfa105be-eb9e-4e68-8aa1-5d91e002ca3c\" (UID: \"bfa105be-eb9e-4e68-8aa1-5d91e002ca3c\") " Jan 23 11:01:40 crc kubenswrapper[4957]: I0123 11:01:40.060849 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qdlm4\" (UniqueName: \"kubernetes.io/projected/bfa105be-eb9e-4e68-8aa1-5d91e002ca3c-kube-api-access-qdlm4\") pod \"bfa105be-eb9e-4e68-8aa1-5d91e002ca3c\" (UID: \"bfa105be-eb9e-4e68-8aa1-5d91e002ca3c\") " Jan 23 11:01:40 crc kubenswrapper[4957]: I0123 11:01:40.061440 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bfa105be-eb9e-4e68-8aa1-5d91e002ca3c-bundle" (OuterVolumeSpecName: "bundle") pod "bfa105be-eb9e-4e68-8aa1-5d91e002ca3c" (UID: "bfa105be-eb9e-4e68-8aa1-5d91e002ca3c"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 23 11:01:40 crc kubenswrapper[4957]: I0123 11:01:40.066406 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfa105be-eb9e-4e68-8aa1-5d91e002ca3c-kube-api-access-qdlm4" (OuterVolumeSpecName: "kube-api-access-qdlm4") pod "bfa105be-eb9e-4e68-8aa1-5d91e002ca3c" (UID: "bfa105be-eb9e-4e68-8aa1-5d91e002ca3c"). InnerVolumeSpecName "kube-api-access-qdlm4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 23 11:01:40 crc kubenswrapper[4957]: I0123 11:01:40.072602 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bfa105be-eb9e-4e68-8aa1-5d91e002ca3c-util" (OuterVolumeSpecName: "util") pod "bfa105be-eb9e-4e68-8aa1-5d91e002ca3c" (UID: "bfa105be-eb9e-4e68-8aa1-5d91e002ca3c"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 23 11:01:40 crc kubenswrapper[4957]: I0123 11:01:40.162398 4957 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bfa105be-eb9e-4e68-8aa1-5d91e002ca3c-util\") on node \"crc\" DevicePath \"\"" Jan 23 11:01:40 crc kubenswrapper[4957]: I0123 11:01:40.162437 4957 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qdlm4\" (UniqueName: \"kubernetes.io/projected/bfa105be-eb9e-4e68-8aa1-5d91e002ca3c-kube-api-access-qdlm4\") on node \"crc\" DevicePath \"\"" Jan 23 11:01:40 crc kubenswrapper[4957]: I0123 11:01:40.162452 4957 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bfa105be-eb9e-4e68-8aa1-5d91e002ca3c-bundle\") on node \"crc\" DevicePath \"\"" Jan 23 11:01:40 crc kubenswrapper[4957]: I0123 11:01:40.736982 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5emt7dq" event={"ID":"bfa105be-eb9e-4e68-8aa1-5d91e002ca3c","Type":"ContainerDied","Data":"64bacc1af6e8a9039b00aa19930a855b81c6302fc3962969d9c98c6ac2987375"} Jan 23 11:01:40 crc kubenswrapper[4957]: I0123 11:01:40.737031 4957 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="64bacc1af6e8a9039b00aa19930a855b81c6302fc3962969d9c98c6ac2987375" Jan 23 11:01:40 crc kubenswrapper[4957]: I0123 11:01:40.737039 4957 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5emt7dq" Jan 23 11:01:41 crc kubenswrapper[4957]: I0123 11:01:41.153702 4957 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-dz78b"] Jan 23 11:01:41 crc kubenswrapper[4957]: E0123 11:01:41.154011 4957 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfa105be-eb9e-4e68-8aa1-5d91e002ca3c" containerName="util" Jan 23 11:01:41 crc kubenswrapper[4957]: I0123 11:01:41.154038 4957 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfa105be-eb9e-4e68-8aa1-5d91e002ca3c" containerName="util" Jan 23 11:01:41 crc kubenswrapper[4957]: E0123 11:01:41.154058 4957 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05351f1f-ef4e-4c7a-a093-3cec8b6f3f56" containerName="pull" Jan 23 11:01:41 crc kubenswrapper[4957]: I0123 11:01:41.154070 4957 state_mem.go:107] "Deleted CPUSet assignment" podUID="05351f1f-ef4e-4c7a-a093-3cec8b6f3f56" containerName="pull" Jan 23 11:01:41 crc kubenswrapper[4957]: E0123 11:01:41.154087 4957 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84a261d5-d82c-4cbe-9079-64c9380bcd44" containerName="pull" Jan 23 11:01:41 crc kubenswrapper[4957]: I0123 11:01:41.154100 4957 state_mem.go:107] "Deleted CPUSet assignment" podUID="84a261d5-d82c-4cbe-9079-64c9380bcd44" containerName="pull" Jan 23 11:01:41 crc kubenswrapper[4957]: E0123 11:01:41.154111 4957 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84a261d5-d82c-4cbe-9079-64c9380bcd44" containerName="util" Jan 23 11:01:41 crc kubenswrapper[4957]: I0123 11:01:41.154120 4957 state_mem.go:107] "Deleted CPUSet assignment" podUID="84a261d5-d82c-4cbe-9079-64c9380bcd44" containerName="util" Jan 23 11:01:41 crc kubenswrapper[4957]: E0123 11:01:41.154134 4957 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05351f1f-ef4e-4c7a-a093-3cec8b6f3f56" containerName="extract" Jan 23 11:01:41 crc 
kubenswrapper[4957]: I0123 11:01:41.154144 4957 state_mem.go:107] "Deleted CPUSet assignment" podUID="05351f1f-ef4e-4c7a-a093-3cec8b6f3f56" containerName="extract" Jan 23 11:01:41 crc kubenswrapper[4957]: E0123 11:01:41.154162 4957 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfa105be-eb9e-4e68-8aa1-5d91e002ca3c" containerName="pull" Jan 23 11:01:41 crc kubenswrapper[4957]: I0123 11:01:41.154170 4957 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfa105be-eb9e-4e68-8aa1-5d91e002ca3c" containerName="pull" Jan 23 11:01:41 crc kubenswrapper[4957]: E0123 11:01:41.154181 4957 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05351f1f-ef4e-4c7a-a093-3cec8b6f3f56" containerName="util" Jan 23 11:01:41 crc kubenswrapper[4957]: I0123 11:01:41.154189 4957 state_mem.go:107] "Deleted CPUSet assignment" podUID="05351f1f-ef4e-4c7a-a093-3cec8b6f3f56" containerName="util" Jan 23 11:01:41 crc kubenswrapper[4957]: E0123 11:01:41.154202 4957 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfa105be-eb9e-4e68-8aa1-5d91e002ca3c" containerName="extract" Jan 23 11:01:41 crc kubenswrapper[4957]: I0123 11:01:41.154210 4957 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfa105be-eb9e-4e68-8aa1-5d91e002ca3c" containerName="extract" Jan 23 11:01:41 crc kubenswrapper[4957]: E0123 11:01:41.154223 4957 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84a261d5-d82c-4cbe-9079-64c9380bcd44" containerName="extract" Jan 23 11:01:41 crc kubenswrapper[4957]: I0123 11:01:41.154231 4957 state_mem.go:107] "Deleted CPUSet assignment" podUID="84a261d5-d82c-4cbe-9079-64c9380bcd44" containerName="extract" Jan 23 11:01:41 crc kubenswrapper[4957]: I0123 11:01:41.154395 4957 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfa105be-eb9e-4e68-8aa1-5d91e002ca3c" containerName="extract" Jan 23 11:01:41 crc kubenswrapper[4957]: I0123 11:01:41.154409 4957 memory_manager.go:354] "RemoveStaleState removing state" podUID="84a261d5-d82c-4cbe-9079-64c9380bcd44" containerName="extract" Jan 23 11:01:41 crc kubenswrapper[4957]: I0123 11:01:41.154422 4957 memory_manager.go:354] "RemoveStaleState removing state" podUID="05351f1f-ef4e-4c7a-a093-3cec8b6f3f56" containerName="extract" Jan 23 11:01:41 crc kubenswrapper[4957]: I0123 11:01:41.154865 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-dz78b" Jan 23 11:01:41 crc kubenswrapper[4957]: I0123 11:01:41.156969 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-5xjxz" Jan 23 11:01:41 crc kubenswrapper[4957]: I0123 11:01:41.157363 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Jan 23 11:01:41 crc kubenswrapper[4957]: I0123 11:01:41.157555 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Jan 23 11:01:41 crc kubenswrapper[4957]: I0123 11:01:41.163801 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-dz78b"] Jan 23 11:01:41 crc kubenswrapper[4957]: I0123 11:01:41.264941 4957 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-8447f4d48-nbj4q"] Jan 23 11:01:41 crc kubenswrapper[4957]: I0123 11:01:41.266142 4957 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-8447f4d48-nbj4q" Jan 23 11:01:41 crc kubenswrapper[4957]: I0123 11:01:41.270742 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Jan 23 11:01:41 crc kubenswrapper[4957]: I0123 11:01:41.271406 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-mtvnh" Jan 23 11:01:41 crc kubenswrapper[4957]: I0123 11:01:41.274521 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-8447f4d48-nbj4q"] Jan 23 11:01:41 crc kubenswrapper[4957]: I0123 11:01:41.275099 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5znxm\" (UniqueName: \"kubernetes.io/projected/36135589-afeb-4693-8919-41232243a809-kube-api-access-5znxm\") pod \"obo-prometheus-operator-68bc856cb9-dz78b\" (UID: \"36135589-afeb-4693-8919-41232243a809\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-dz78b" Jan 23 11:01:41 crc kubenswrapper[4957]: I0123 11:01:41.282063 4957 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-8447f4d48-4kbjr"] Jan 23 11:01:41 crc kubenswrapper[4957]: I0123 11:01:41.282824 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-8447f4d48-4kbjr" Jan 23 11:01:41 crc kubenswrapper[4957]: I0123 11:01:41.289172 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-8447f4d48-4kbjr"] Jan 23 11:01:41 crc kubenswrapper[4957]: I0123 11:01:41.376698 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f84a2c7c-d1fc-43f2-8e87-30e51f367c23-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-8447f4d48-nbj4q\" (UID: \"f84a2c7c-d1fc-43f2-8e87-30e51f367c23\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-8447f4d48-nbj4q" Jan 23 11:01:41 crc kubenswrapper[4957]: I0123 11:01:41.376768 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5znxm\" (UniqueName: \"kubernetes.io/projected/36135589-afeb-4693-8919-41232243a809-kube-api-access-5znxm\") pod \"obo-prometheus-operator-68bc856cb9-dz78b\" (UID: \"36135589-afeb-4693-8919-41232243a809\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-dz78b" Jan 23 11:01:41 crc kubenswrapper[4957]: I0123 11:01:41.376799 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5efeae78-a9ef-4b16-b3b9-f022a1ed43eb-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-8447f4d48-4kbjr\" (UID: \"5efeae78-a9ef-4b16-b3b9-f022a1ed43eb\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-8447f4d48-4kbjr" Jan 23 11:01:41 crc kubenswrapper[4957]: I0123 11:01:41.376840 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f84a2c7c-d1fc-43f2-8e87-30e51f367c23-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-8447f4d48-nbj4q\" (UID: 
\"f84a2c7c-d1fc-43f2-8e87-30e51f367c23\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-8447f4d48-nbj4q" Jan 23 11:01:41 crc kubenswrapper[4957]: I0123 11:01:41.376869 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5efeae78-a9ef-4b16-b3b9-f022a1ed43eb-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-8447f4d48-4kbjr\" (UID: \"5efeae78-a9ef-4b16-b3b9-f022a1ed43eb\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-8447f4d48-4kbjr" Jan 23 11:01:41 crc kubenswrapper[4957]: I0123 11:01:41.406305 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5znxm\" (UniqueName: \"kubernetes.io/projected/36135589-afeb-4693-8919-41232243a809-kube-api-access-5znxm\") pod \"obo-prometheus-operator-68bc856cb9-dz78b\" (UID: \"36135589-afeb-4693-8919-41232243a809\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-dz78b" Jan 23 11:01:41 crc kubenswrapper[4957]: I0123 11:01:41.466246 4957 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-gl5gd"] Jan 23 11:01:41 crc kubenswrapper[4957]: I0123 11:01:41.467154 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-gl5gd" Jan 23 11:01:41 crc kubenswrapper[4957]: I0123 11:01:41.469004 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Jan 23 11:01:41 crc kubenswrapper[4957]: I0123 11:01:41.469633 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-lxcpf" Jan 23 11:01:41 crc kubenswrapper[4957]: I0123 11:01:41.472141 4957 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-dz78b" Jan 23 11:01:41 crc kubenswrapper[4957]: I0123 11:01:41.480607 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-gl5gd"] Jan 23 11:01:41 crc kubenswrapper[4957]: I0123 11:01:41.482029 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f84a2c7c-d1fc-43f2-8e87-30e51f367c23-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-8447f4d48-nbj4q\" (UID: \"f84a2c7c-d1fc-43f2-8e87-30e51f367c23\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-8447f4d48-nbj4q" Jan 23 11:01:41 crc kubenswrapper[4957]: I0123 11:01:41.482088 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5efeae78-a9ef-4b16-b3b9-f022a1ed43eb-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-8447f4d48-4kbjr\" (UID: \"5efeae78-a9ef-4b16-b3b9-f022a1ed43eb\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-8447f4d48-4kbjr" Jan 23 11:01:41 crc kubenswrapper[4957]: I0123 11:01:41.482128 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f84a2c7c-d1fc-43f2-8e87-30e51f367c23-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-8447f4d48-nbj4q\" (UID: \"f84a2c7c-d1fc-43f2-8e87-30e51f367c23\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-8447f4d48-nbj4q" Jan 23 11:01:41 crc kubenswrapper[4957]: I0123 11:01:41.482156 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5efeae78-a9ef-4b16-b3b9-f022a1ed43eb-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-8447f4d48-4kbjr\" (UID: \"5efeae78-a9ef-4b16-b3b9-f022a1ed43eb\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-8447f4d48-4kbjr" Jan 23 11:01:41 crc kubenswrapper[4957]: I0123 11:01:41.486042 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5efeae78-a9ef-4b16-b3b9-f022a1ed43eb-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-8447f4d48-4kbjr\" (UID: \"5efeae78-a9ef-4b16-b3b9-f022a1ed43eb\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-8447f4d48-4kbjr" Jan 23 11:01:41 crc kubenswrapper[4957]: I0123 11:01:41.486246 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f84a2c7c-d1fc-43f2-8e87-30e51f367c23-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-8447f4d48-nbj4q\" (UID: \"f84a2c7c-d1fc-43f2-8e87-30e51f367c23\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-8447f4d48-nbj4q" Jan 23 11:01:41 crc kubenswrapper[4957]: I0123 11:01:41.487859 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5efeae78-a9ef-4b16-b3b9-f022a1ed43eb-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-8447f4d48-4kbjr\" (UID: \"5efeae78-a9ef-4b16-b3b9-f022a1ed43eb\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-8447f4d48-4kbjr" Jan 23 11:01:41 crc kubenswrapper[4957]: I0123 11:01:41.493659 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f84a2c7c-d1fc-43f2-8e87-30e51f367c23-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-8447f4d48-nbj4q\" (UID: \"f84a2c7c-d1fc-43f2-8e87-30e51f367c23\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-8447f4d48-nbj4q" Jan 23 11:01:41 crc kubenswrapper[4957]: I0123 11:01:41.585844 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-8447f4d48-nbj4q" Jan 23 11:01:41 crc kubenswrapper[4957]: I0123 11:01:41.586213 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wp7zz\" (UniqueName: \"kubernetes.io/projected/49486d1a-778e-4341-a8c7-a73763a29afc-kube-api-access-wp7zz\") pod \"observability-operator-59bdc8b94-gl5gd\" (UID: \"49486d1a-778e-4341-a8c7-a73763a29afc\") " pod="openshift-operators/observability-operator-59bdc8b94-gl5gd" Jan 23 11:01:41 crc kubenswrapper[4957]: I0123 11:01:41.586271 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/49486d1a-778e-4341-a8c7-a73763a29afc-observability-operator-tls\") pod \"observability-operator-59bdc8b94-gl5gd\" (UID: \"49486d1a-778e-4341-a8c7-a73763a29afc\") " pod="openshift-operators/observability-operator-59bdc8b94-gl5gd" Jan 23 11:01:41 crc kubenswrapper[4957]: I0123 11:01:41.600092 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-8447f4d48-4kbjr" Jan 23 11:01:41 crc kubenswrapper[4957]: I0123 11:01:41.673825 4957 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-75hhn"] Jan 23 11:01:41 crc kubenswrapper[4957]: I0123 11:01:41.675632 4957 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-75hhn" Jan 23 11:01:41 crc kubenswrapper[4957]: I0123 11:01:41.687045 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wp7zz\" (UniqueName: \"kubernetes.io/projected/49486d1a-778e-4341-a8c7-a73763a29afc-kube-api-access-wp7zz\") pod \"observability-operator-59bdc8b94-gl5gd\" (UID: \"49486d1a-778e-4341-a8c7-a73763a29afc\") " pod="openshift-operators/observability-operator-59bdc8b94-gl5gd" Jan 23 11:01:41 crc kubenswrapper[4957]: I0123 11:01:41.687374 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/49486d1a-778e-4341-a8c7-a73763a29afc-observability-operator-tls\") pod \"observability-operator-59bdc8b94-gl5gd\" (UID: \"49486d1a-778e-4341-a8c7-a73763a29afc\") " pod="openshift-operators/observability-operator-59bdc8b94-gl5gd" Jan 23 11:01:41 crc kubenswrapper[4957]: I0123 11:01:41.688311 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-qvdkw" Jan 23 11:01:41 crc kubenswrapper[4957]: I0123 11:01:41.695981 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/49486d1a-778e-4341-a8c7-a73763a29afc-observability-operator-tls\") pod \"observability-operator-59bdc8b94-gl5gd\" (UID: \"49486d1a-778e-4341-a8c7-a73763a29afc\") " pod="openshift-operators/observability-operator-59bdc8b94-gl5gd" Jan 23 11:01:41 crc kubenswrapper[4957]: I0123 11:01:41.707265 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-75hhn"] Jan 23 11:01:41 crc kubenswrapper[4957]: I0123 11:01:41.745987 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wp7zz\" (UniqueName: \"kubernetes.io/projected/49486d1a-778e-4341-a8c7-a73763a29afc-kube-api-access-wp7zz\") pod \"observability-operator-59bdc8b94-gl5gd\" (UID: \"49486d1a-778e-4341-a8c7-a73763a29afc\") " pod="openshift-operators/observability-operator-59bdc8b94-gl5gd" Jan 23 11:01:41 crc kubenswrapper[4957]: I0123 11:01:41.788572 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mg4q2\" (UniqueName: \"kubernetes.io/projected/f2407120-3100-4d21-bc9c-2f305d00fc37-kube-api-access-mg4q2\") pod \"perses-operator-5bf474d74f-75hhn\" (UID: \"f2407120-3100-4d21-bc9c-2f305d00fc37\") " pod="openshift-operators/perses-operator-5bf474d74f-75hhn" Jan 23 11:01:41 crc kubenswrapper[4957]: I0123 11:01:41.788629 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/f2407120-3100-4d21-bc9c-2f305d00fc37-openshift-service-ca\") pod \"perses-operator-5bf474d74f-75hhn\" (UID: \"f2407120-3100-4d21-bc9c-2f305d00fc37\") " pod="openshift-operators/perses-operator-5bf474d74f-75hhn" Jan 23 11:01:41 crc kubenswrapper[4957]: I0123 11:01:41.795484 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-dz78b"] Jan 23 11:01:41 crc kubenswrapper[4957]: I0123 11:01:41.867920 4957 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-gl5gd" Jan 23 11:01:41 crc kubenswrapper[4957]: I0123 11:01:41.889791 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mg4q2\" (UniqueName: \"kubernetes.io/projected/f2407120-3100-4d21-bc9c-2f305d00fc37-kube-api-access-mg4q2\") pod \"perses-operator-5bf474d74f-75hhn\" (UID: \"f2407120-3100-4d21-bc9c-2f305d00fc37\") " pod="openshift-operators/perses-operator-5bf474d74f-75hhn" Jan 23 11:01:41 crc kubenswrapper[4957]: I0123 11:01:41.889863 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/f2407120-3100-4d21-bc9c-2f305d00fc37-openshift-service-ca\") pod \"perses-operator-5bf474d74f-75hhn\" (UID: \"f2407120-3100-4d21-bc9c-2f305d00fc37\") " pod="openshift-operators/perses-operator-5bf474d74f-75hhn" Jan 23 11:01:41 crc kubenswrapper[4957]: I0123 11:01:41.919346 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mg4q2\" (UniqueName: \"kubernetes.io/projected/f2407120-3100-4d21-bc9c-2f305d00fc37-kube-api-access-mg4q2\") pod \"perses-operator-5bf474d74f-75hhn\" (UID: \"f2407120-3100-4d21-bc9c-2f305d00fc37\") " pod="openshift-operators/perses-operator-5bf474d74f-75hhn" Jan 23 11:01:41 crc kubenswrapper[4957]: I0123 11:01:41.921227 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/f2407120-3100-4d21-bc9c-2f305d00fc37-openshift-service-ca\") pod \"perses-operator-5bf474d74f-75hhn\" (UID: \"f2407120-3100-4d21-bc9c-2f305d00fc37\") " pod="openshift-operators/perses-operator-5bf474d74f-75hhn" Jan 23 11:01:41 crc kubenswrapper[4957]: I0123 11:01:41.942819 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-8447f4d48-4kbjr"] Jan 23 11:01:42 crc kubenswrapper[4957]: I0123 11:01:42.058385 4957 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-75hhn" Jan 23 11:01:42 crc kubenswrapper[4957]: I0123 11:01:42.138817 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-8447f4d48-nbj4q"] Jan 23 11:01:42 crc kubenswrapper[4957]: W0123 11:01:42.165470 4957 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf84a2c7c_d1fc_43f2_8e87_30e51f367c23.slice/crio-fcfceaaebd95deea3b86ea808c217092dcb7e52c4bbca07ee253f979e45cdca4 WatchSource:0}: Error finding container fcfceaaebd95deea3b86ea808c217092dcb7e52c4bbca07ee253f979e45cdca4: Status 404 returned error can't find the container with id fcfceaaebd95deea3b86ea808c217092dcb7e52c4bbca07ee253f979e45cdca4 Jan 23 11:01:42 crc kubenswrapper[4957]: I0123 11:01:42.165771 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-gl5gd"] Jan 23 11:01:42 crc kubenswrapper[4957]: I0123 11:01:42.279858 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-75hhn"] Jan 23 11:01:42 crc kubenswrapper[4957]: W0123 11:01:42.288815 4957 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf2407120_3100_4d21_bc9c_2f305d00fc37.slice/crio-c0ff4c2f124e17c0916cca3d9b71e9fcf948349d0d2169ac6d785981f8df810f WatchSource:0}: Error finding container c0ff4c2f124e17c0916cca3d9b71e9fcf948349d0d2169ac6d785981f8df810f: Status 404 returned error can't find the container with id c0ff4c2f124e17c0916cca3d9b71e9fcf948349d0d2169ac6d785981f8df810f Jan 23 11:01:42 crc kubenswrapper[4957]: I0123 11:01:42.777978 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-8447f4d48-4kbjr" event={"ID":"5efeae78-a9ef-4b16-b3b9-f022a1ed43eb","Type":"ContainerStarted","Data":"88c188fa7bf55908b127c33d50d37bdacf03f9f6a21818635e42e68989cc4ee8"} Jan 23 11:01:42 crc kubenswrapper[4957]: I0123 11:01:42.778379 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-gl5gd" event={"ID":"49486d1a-778e-4341-a8c7-a73763a29afc","Type":"ContainerStarted","Data":"1d3a3200806a35d7f91d08f52f26cde167e8e0206367945b57d0189c282bd565"} Jan 23 11:01:42 crc kubenswrapper[4957]: I0123 11:01:42.778395 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-dz78b" event={"ID":"36135589-afeb-4693-8919-41232243a809","Type":"ContainerStarted","Data":"f3e1f070a068fe6389cd77f5be2e0411c4dce2dd3d27604fd6aca38a7f4af4b8"} Jan 23 11:01:42 crc kubenswrapper[4957]: I0123 11:01:42.778409 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-75hhn" event={"ID":"f2407120-3100-4d21-bc9c-2f305d00fc37","Type":"ContainerStarted","Data":"c0ff4c2f124e17c0916cca3d9b71e9fcf948349d0d2169ac6d785981f8df810f"} Jan 23 11:01:42 crc kubenswrapper[4957]: I0123 11:01:42.778424 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-8447f4d48-nbj4q" event={"ID":"f84a2c7c-d1fc-43f2-8e87-30e51f367c23","Type":"ContainerStarted","Data":"fcfceaaebd95deea3b86ea808c217092dcb7e52c4bbca07ee253f979e45cdca4"} Jan 23 11:01:45 crc kubenswrapper[4957]: I0123 11:01:45.717384 4957 patch_prober.go:28] interesting 
pod/machine-config-daemon-w2xjv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 23 11:01:45 crc kubenswrapper[4957]: I0123 11:01:45.717443 4957 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w2xjv" podUID="224e3211-1f68-4673-8975-7e71b1e513d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 23 11:01:48 crc kubenswrapper[4957]: I0123 11:01:48.098198 4957 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/elastic-operator-7bbbd866bb-sqdzf"] Jan 23 11:01:48 crc kubenswrapper[4957]: I0123 11:01:48.099464 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/elastic-operator-7bbbd866bb-sqdzf" Jan 23 11:01:48 crc kubenswrapper[4957]: I0123 11:01:48.102695 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"openshift-service-ca.crt" Jan 23 11:01:48 crc kubenswrapper[4957]: I0123 11:01:48.102886 4957 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elastic-operator-service-cert" Jan 23 11:01:48 crc kubenswrapper[4957]: I0123 11:01:48.103024 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"kube-root-ca.crt" Jan 23 11:01:48 crc kubenswrapper[4957]: I0123 11:01:48.103153 4957 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elastic-operator-dockercfg-k4t78" Jan 23 11:01:48 crc kubenswrapper[4957]: I0123 11:01:48.118890 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4b86af1b-a255-4d4d-b789-9d70768ce58d-apiservice-cert\") pod \"elastic-operator-7bbbd866bb-sqdzf\" (UID: \"4b86af1b-a255-4d4d-b789-9d70768ce58d\") " pod="service-telemetry/elastic-operator-7bbbd866bb-sqdzf" Jan 23 11:01:48 crc kubenswrapper[4957]: I0123 11:01:48.118939 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4b86af1b-a255-4d4d-b789-9d70768ce58d-webhook-cert\") pod \"elastic-operator-7bbbd866bb-sqdzf\" (UID: \"4b86af1b-a255-4d4d-b789-9d70768ce58d\") " pod="service-telemetry/elastic-operator-7bbbd866bb-sqdzf" Jan 23 11:01:48 crc kubenswrapper[4957]: I0123 11:01:48.119115 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vzzw\" (UniqueName: \"kubernetes.io/projected/4b86af1b-a255-4d4d-b789-9d70768ce58d-kube-api-access-7vzzw\") pod \"elastic-operator-7bbbd866bb-sqdzf\" (UID: \"4b86af1b-a255-4d4d-b789-9d70768ce58d\") " pod="service-telemetry/elastic-operator-7bbbd866bb-sqdzf" Jan 23 11:01:48 crc kubenswrapper[4957]: I0123 11:01:48.121828 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elastic-operator-7bbbd866bb-sqdzf"] Jan 23 11:01:48 crc kubenswrapper[4957]: I0123 11:01:48.222076 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vzzw\" (UniqueName: \"kubernetes.io/projected/4b86af1b-a255-4d4d-b789-9d70768ce58d-kube-api-access-7vzzw\") pod \"elastic-operator-7bbbd866bb-sqdzf\" (UID: 
\"4b86af1b-a255-4d4d-b789-9d70768ce58d\") " pod="service-telemetry/elastic-operator-7bbbd866bb-sqdzf" Jan 23 11:01:48 crc kubenswrapper[4957]: I0123 11:01:48.222139 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4b86af1b-a255-4d4d-b789-9d70768ce58d-apiservice-cert\") pod \"elastic-operator-7bbbd866bb-sqdzf\" (UID: \"4b86af1b-a255-4d4d-b789-9d70768ce58d\") " pod="service-telemetry/elastic-operator-7bbbd866bb-sqdzf" Jan 23 11:01:48 crc kubenswrapper[4957]: I0123 11:01:48.222158 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4b86af1b-a255-4d4d-b789-9d70768ce58d-webhook-cert\") pod \"elastic-operator-7bbbd866bb-sqdzf\" (UID: \"4b86af1b-a255-4d4d-b789-9d70768ce58d\") " pod="service-telemetry/elastic-operator-7bbbd866bb-sqdzf" Jan 23 11:01:48 crc kubenswrapper[4957]: I0123 11:01:48.253057 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4b86af1b-a255-4d4d-b789-9d70768ce58d-apiservice-cert\") pod \"elastic-operator-7bbbd866bb-sqdzf\" (UID: \"4b86af1b-a255-4d4d-b789-9d70768ce58d\") " pod="service-telemetry/elastic-operator-7bbbd866bb-sqdzf" Jan 23 11:01:48 crc kubenswrapper[4957]: I0123 11:01:48.253695 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4b86af1b-a255-4d4d-b789-9d70768ce58d-webhook-cert\") pod \"elastic-operator-7bbbd866bb-sqdzf\" (UID: \"4b86af1b-a255-4d4d-b789-9d70768ce58d\") " pod="service-telemetry/elastic-operator-7bbbd866bb-sqdzf" Jan 23 11:01:48 crc kubenswrapper[4957]: I0123 11:01:48.257215 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vzzw\" (UniqueName: \"kubernetes.io/projected/4b86af1b-a255-4d4d-b789-9d70768ce58d-kube-api-access-7vzzw\") pod \"elastic-operator-7bbbd866bb-sqdzf\" (UID: \"4b86af1b-a255-4d4d-b789-9d70768ce58d\") " pod="service-telemetry/elastic-operator-7bbbd866bb-sqdzf" Jan 23 11:01:48 crc kubenswrapper[4957]: I0123 11:01:48.426079 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/elastic-operator-7bbbd866bb-sqdzf" Jan 23 11:01:52 crc kubenswrapper[4957]: I0123 11:01:52.001816 4957 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/interconnect-operator-5bb49f789d-v464f"] Jan 23 11:01:52 crc kubenswrapper[4957]: I0123 11:01:52.002752 4957 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/interconnect-operator-5bb49f789d-v464f" Jan 23 11:01:52 crc kubenswrapper[4957]: I0123 11:01:52.004502 4957 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"interconnect-operator-dockercfg-lgjwf" Jan 23 11:01:52 crc kubenswrapper[4957]: I0123 11:01:52.049120 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/interconnect-operator-5bb49f789d-v464f"] Jan 23 11:01:52 crc kubenswrapper[4957]: I0123 11:01:52.065519 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfzpc\" (UniqueName: \"kubernetes.io/projected/834fa095-b096-4231-b085-dbcfe6fac038-kube-api-access-rfzpc\") pod \"interconnect-operator-5bb49f789d-v464f\" (UID: \"834fa095-b096-4231-b085-dbcfe6fac038\") " pod="service-telemetry/interconnect-operator-5bb49f789d-v464f" Jan 23 11:01:52 crc kubenswrapper[4957]: I0123 11:01:52.166774 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfzpc\" (UniqueName: \"kubernetes.io/projected/834fa095-b096-4231-b085-dbcfe6fac038-kube-api-access-rfzpc\") pod \"interconnect-operator-5bb49f789d-v464f\" (UID: \"834fa095-b096-4231-b085-dbcfe6fac038\") " pod="service-telemetry/interconnect-operator-5bb49f789d-v464f" Jan 23 11:01:52 crc kubenswrapper[4957]: I0123 11:01:52.183979 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfzpc\" (UniqueName: \"kubernetes.io/projected/834fa095-b096-4231-b085-dbcfe6fac038-kube-api-access-rfzpc\") pod \"interconnect-operator-5bb49f789d-v464f\" (UID: \"834fa095-b096-4231-b085-dbcfe6fac038\") " pod="service-telemetry/interconnect-operator-5bb49f789d-v464f" Jan 23 11:01:52 crc kubenswrapper[4957]: I0123 11:01:52.316088 4957 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/interconnect-operator-5bb49f789d-v464f" Jan 23 11:01:55 crc kubenswrapper[4957]: I0123 11:01:55.532125 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/interconnect-operator-5bb49f789d-v464f"] Jan 23 11:01:55 crc kubenswrapper[4957]: I0123 11:01:55.628151 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elastic-operator-7bbbd866bb-sqdzf"] Jan 23 11:01:55 crc kubenswrapper[4957]: W0123 11:01:55.631505 4957 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b86af1b_a255_4d4d_b789_9d70768ce58d.slice/crio-7e45ec6901ea09d004f4367b002424fd0da9362221b206fff59ead36be2d5eab WatchSource:0}: Error finding container 7e45ec6901ea09d004f4367b002424fd0da9362221b206fff59ead36be2d5eab: Status 404 returned error can't find the container with id 7e45ec6901ea09d004f4367b002424fd0da9362221b206fff59ead36be2d5eab Jan 23 11:01:55 crc kubenswrapper[4957]: I0123 11:01:55.865313 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-gl5gd" event={"ID":"49486d1a-778e-4341-a8c7-a73763a29afc","Type":"ContainerStarted","Data":"f03ad3d9e030598ecde2ac16bae3b04729f6b1f989ac254efc8b964b8851cd8c"} Jan 23 11:01:55 crc kubenswrapper[4957]: I0123 11:01:55.866495 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-59bdc8b94-gl5gd" Jan 23 11:01:55 crc kubenswrapper[4957]: I0123 11:01:55.867694 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-dz78b" event={"ID":"36135589-afeb-4693-8919-41232243a809","Type":"ContainerStarted","Data":"bc93a1d78b8637f26fea950679fbeca4801a6de78b1d338bbd16e5f539f017b1"} Jan 23 11:01:55 crc kubenswrapper[4957]: I0123 11:01:55.870553 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-75hhn" event={"ID":"f2407120-3100-4d21-bc9c-2f305d00fc37","Type":"ContainerStarted","Data":"0d8ba14ce893a8ac0ce5202b156e6d89fb91df1633986b239dba0c4abd49862c"} Jan 23 11:01:55 crc kubenswrapper[4957]: I0123 11:01:55.871020 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5bf474d74f-75hhn" Jan 23 11:01:55 crc kubenswrapper[4957]: I0123 11:01:55.872532 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-8447f4d48-nbj4q" event={"ID":"f84a2c7c-d1fc-43f2-8e87-30e51f367c23","Type":"ContainerStarted","Data":"e3181d112d99a79b28a5a69034677e23343b5cc1a30750fb8942421947c55b80"} Jan 23 11:01:55 crc kubenswrapper[4957]: I0123 11:01:55.874331 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/interconnect-operator-5bb49f789d-v464f" event={"ID":"834fa095-b096-4231-b085-dbcfe6fac038","Type":"ContainerStarted","Data":"656277f2e5a820550f183a7d170c0e9de0a801850313b9018f0dd776ebd8691b"} Jan 23 11:01:55 crc kubenswrapper[4957]: I0123 11:01:55.875372 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elastic-operator-7bbbd866bb-sqdzf" event={"ID":"4b86af1b-a255-4d4d-b789-9d70768ce58d","Type":"ContainerStarted","Data":"7e45ec6901ea09d004f4367b002424fd0da9362221b206fff59ead36be2d5eab"} Jan 23 11:01:55 crc kubenswrapper[4957]: I0123 11:01:55.876950 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-8447f4d48-4kbjr" event={"ID":"5efeae78-a9ef-4b16-b3b9-f022a1ed43eb","Type":"ContainerStarted","Data":"80fb3fd041fbbc24669654bbc614f9025edd373319828930f6d493f9eb71c313"} Jan 23 11:01:55 crc kubenswrapper[4957]: I0123 11:01:55.884842 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-59bdc8b94-gl5gd" Jan 23 11:01:55 crc kubenswrapper[4957]: I0123 11:01:55.908206 4957 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-59bdc8b94-gl5gd" podStartSLOduration=1.800013775 podStartE2EDuration="14.908185579s" podCreationTimestamp="2026-01-23 11:01:41 +0000 UTC" firstStartedPulling="2026-01-23 11:01:42.177566472 +0000 UTC m=+611.714819159" lastFinishedPulling="2026-01-23 11:01:55.285738276 +0000 UTC m=+624.822990963" observedRunningTime="2026-01-23 11:01:55.885259027 +0000 UTC m=+625.422511734" watchObservedRunningTime="2026-01-23 11:01:55.908185579 +0000 UTC m=+625.445438276" Jan 23 11:01:55 crc kubenswrapper[4957]: I0123 11:01:55.910731 4957 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-8447f4d48-4kbjr" podStartSLOduration=1.6411305120000002 podStartE2EDuration="14.910720811s" podCreationTimestamp="2026-01-23 11:01:41 +0000 UTC" firstStartedPulling="2026-01-23 11:01:42.017602418 +0000 UTC m=+611.554855105" lastFinishedPulling="2026-01-23 11:01:55.287192717 +0000 UTC m=+624.824445404" observedRunningTime="2026-01-23 11:01:55.906580434 +0000 UTC m=+625.443833131" watchObservedRunningTime="2026-01-23 11:01:55.910720811 +0000 UTC m=+625.447973508" Jan 23 11:01:55 crc kubenswrapper[4957]: I0123 11:01:55.930593 4957 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5bf474d74f-75hhn" podStartSLOduration=1.93638355 podStartE2EDuration="14.930570875s" podCreationTimestamp="2026-01-23 11:01:41 +0000 UTC" firstStartedPulling="2026-01-23 11:01:42.291487679 +0000 UTC m=+611.828740366" lastFinishedPulling="2026-01-23 11:01:55.285675014 +0000 UTC m=+624.822927691" observedRunningTime="2026-01-23 11:01:55.928253579 +0000 UTC m=+625.465506266" watchObservedRunningTime="2026-01-23 11:01:55.930570875 +0000 UTC m=+625.467823562" Jan 23 11:01:55 crc kubenswrapper[4957]: I0123 11:01:55.950543 4957 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-dz78b" podStartSLOduration=1.4538465010000001 podStartE2EDuration="14.950514051s" podCreationTimestamp="2026-01-23 11:01:41 +0000 UTC" firstStartedPulling="2026-01-23 11:01:41.788970533 +0000 UTC m=+611.326223220" lastFinishedPulling="2026-01-23 11:01:55.285638083 +0000 UTC m=+624.822890770" observedRunningTime="2026-01-23 11:01:55.94835059 +0000 UTC m=+625.485603277" watchObservedRunningTime="2026-01-23 11:01:55.950514051 +0000 UTC m=+625.487766738" Jan 23 11:01:55 crc kubenswrapper[4957]: I0123 11:01:55.972912 4957 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-8447f4d48-nbj4q" podStartSLOduration=1.854437951 podStartE2EDuration="14.972878267s" podCreationTimestamp="2026-01-23 11:01:41 +0000 UTC" firstStartedPulling="2026-01-23 11:01:42.167771404 +0000 UTC m=+611.705024091" lastFinishedPulling="2026-01-23 11:01:55.28621172 +0000 UTC m=+624.823464407" 
observedRunningTime="2026-01-23 11:01:55.967855884 +0000 UTC m=+625.505108571" watchObservedRunningTime="2026-01-23 11:01:55.972878267 +0000 UTC m=+625.510130944" Jan 23 11:01:57 crc kubenswrapper[4957]: I0123 11:01:57.895422 4957 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-5446d6888b-kw5w6"] Jan 23 11:01:57 crc kubenswrapper[4957]: I0123 11:01:57.896102 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-5446d6888b-kw5w6" Jan 23 11:01:57 crc kubenswrapper[4957]: I0123 11:01:57.903125 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt" Jan 23 11:01:57 crc kubenswrapper[4957]: I0123 11:01:57.903331 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt" Jan 23 11:01:57 crc kubenswrapper[4957]: I0123 11:01:57.904194 4957 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-nklm7" Jan 23 11:01:57 crc kubenswrapper[4957]: I0123 11:01:57.950726 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-5446d6888b-kw5w6"] Jan 23 11:01:58 crc kubenswrapper[4957]: I0123 11:01:58.045061 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6phhj\" (UniqueName: \"kubernetes.io/projected/e702864e-6944-4db0-be75-a846b989b105-kube-api-access-6phhj\") pod \"cert-manager-operator-controller-manager-5446d6888b-kw5w6\" (UID: \"e702864e-6944-4db0-be75-a846b989b105\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5446d6888b-kw5w6" Jan 23 11:01:58 crc kubenswrapper[4957]: I0123 11:01:58.045146 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e702864e-6944-4db0-be75-a846b989b105-tmp\") pod \"cert-manager-operator-controller-manager-5446d6888b-kw5w6\" (UID: \"e702864e-6944-4db0-be75-a846b989b105\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5446d6888b-kw5w6" Jan 23 11:01:58 crc kubenswrapper[4957]: I0123 11:01:58.149987 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6phhj\" (UniqueName: \"kubernetes.io/projected/e702864e-6944-4db0-be75-a846b989b105-kube-api-access-6phhj\") pod \"cert-manager-operator-controller-manager-5446d6888b-kw5w6\" (UID: \"e702864e-6944-4db0-be75-a846b989b105\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5446d6888b-kw5w6" Jan 23 11:01:58 crc kubenswrapper[4957]: I0123 11:01:58.150103 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e702864e-6944-4db0-be75-a846b989b105-tmp\") pod \"cert-manager-operator-controller-manager-5446d6888b-kw5w6\" (UID: \"e702864e-6944-4db0-be75-a846b989b105\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5446d6888b-kw5w6" Jan 23 11:01:58 crc kubenswrapper[4957]: I0123 11:01:58.150980 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e702864e-6944-4db0-be75-a846b989b105-tmp\") pod \"cert-manager-operator-controller-manager-5446d6888b-kw5w6\" (UID: 
\"e702864e-6944-4db0-be75-a846b989b105\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5446d6888b-kw5w6" Jan 23 11:01:58 crc kubenswrapper[4957]: I0123 11:01:58.170854 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6phhj\" (UniqueName: \"kubernetes.io/projected/e702864e-6944-4db0-be75-a846b989b105-kube-api-access-6phhj\") pod \"cert-manager-operator-controller-manager-5446d6888b-kw5w6\" (UID: \"e702864e-6944-4db0-be75-a846b989b105\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5446d6888b-kw5w6" Jan 23 11:01:58 crc kubenswrapper[4957]: I0123 11:01:58.220479 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-5446d6888b-kw5w6" Jan 23 11:01:58 crc kubenswrapper[4957]: I0123 11:01:58.900571 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elastic-operator-7bbbd866bb-sqdzf" event={"ID":"4b86af1b-a255-4d4d-b789-9d70768ce58d","Type":"ContainerStarted","Data":"6da3225154ce99c821ce5453aa26f53deb42311370c48a0f8df79ef6a54e0e73"} Jan 23 11:01:58 crc kubenswrapper[4957]: I0123 11:01:58.919514 4957 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/elastic-operator-7bbbd866bb-sqdzf" podStartSLOduration=7.944748529 podStartE2EDuration="10.919497136s" podCreationTimestamp="2026-01-23 11:01:48 +0000 UTC" firstStartedPulling="2026-01-23 11:01:55.6346999 +0000 UTC m=+625.171952587" lastFinishedPulling="2026-01-23 11:01:58.609448507 +0000 UTC m=+628.146701194" observedRunningTime="2026-01-23 11:01:58.918751104 +0000 UTC m=+628.456003791" watchObservedRunningTime="2026-01-23 11:01:58.919497136 +0000 UTC m=+628.456749823" Jan 23 11:01:58 crc kubenswrapper[4957]: I0123 11:01:58.981909 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-5446d6888b-kw5w6"] Jan 23 11:01:58 crc kubenswrapper[4957]: W0123 11:01:58.989747 4957 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode702864e_6944_4db0_be75_a846b989b105.slice/crio-b95485db5ec56d2a88570c7d04577276c8bfff007d197b432b91541a02e4ca14 WatchSource:0}: Error finding container b95485db5ec56d2a88570c7d04577276c8bfff007d197b432b91541a02e4ca14: Status 404 returned error can't find the container with id b95485db5ec56d2a88570c7d04577276c8bfff007d197b432b91541a02e4ca14 Jan 23 11:01:59 crc kubenswrapper[4957]: I0123 11:01:59.912161 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-5446d6888b-kw5w6" event={"ID":"e702864e-6944-4db0-be75-a846b989b105","Type":"ContainerStarted","Data":"b95485db5ec56d2a88570c7d04577276c8bfff007d197b432b91541a02e4ca14"} Jan 23 11:02:02 crc kubenswrapper[4957]: I0123 11:02:02.063166 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5bf474d74f-75hhn" Jan 23 11:02:02 crc kubenswrapper[4957]: I0123 11:02:02.463412 4957 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Jan 23 11:02:02 crc kubenswrapper[4957]: I0123 11:02:02.465022 4957 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/elasticsearch-es-default-0" Jan 23 11:02:02 crc kubenswrapper[4957]: I0123 11:02:02.466866 4957 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-default-es-config" Jan 23 11:02:02 crc kubenswrapper[4957]: I0123 11:02:02.466961 4957 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-http-certs-internal" Jan 23 11:02:02 crc kubenswrapper[4957]: I0123 11:02:02.467651 4957 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-default-es-transport-certs" Jan 23 11:02:02 crc kubenswrapper[4957]: I0123 11:02:02.467931 4957 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-dockercfg-9pt8n" Jan 23 11:02:02 crc kubenswrapper[4957]: I0123 11:02:02.470075 4957 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-remote-ca" Jan 23 11:02:02 crc kubenswrapper[4957]: I0123 11:02:02.470333 4957 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-internal-users" Jan 23 11:02:02 crc kubenswrapper[4957]: I0123 11:02:02.470339 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"elasticsearch-es-unicast-hosts" Jan 23 11:02:02 crc kubenswrapper[4957]: I0123 11:02:02.470417 4957 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-xpack-file-realm" Jan 23 11:02:02 crc kubenswrapper[4957]: I0123 11:02:02.470506 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"elasticsearch-es-scripts" Jan 23 11:02:02 crc kubenswrapper[4957]: I0123 11:02:02.491874 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Jan 23 11:02:02 crc kubenswrapper[4957]: I0123 11:02:02.607995 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"downward-api\" (UniqueName: \"kubernetes.io/downward-api/8128176f-ed6a-4d47-84dd-11b2ada7a5ca-downward-api\") pod \"elasticsearch-es-default-0\" (UID: \"8128176f-ed6a-4d47-84dd-11b2ada7a5ca\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 23 11:02:02 crc kubenswrapper[4957]: I0123 11:02:02.608058 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-elasticsearch-plugins-local\" (UniqueName: \"kubernetes.io/empty-dir/8128176f-ed6a-4d47-84dd-11b2ada7a5ca-elastic-internal-elasticsearch-plugins-local\") pod \"elasticsearch-es-default-0\" (UID: \"8128176f-ed6a-4d47-84dd-11b2ada7a5ca\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 23 11:02:02 crc kubenswrapper[4957]: I0123 11:02:02.608079 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-xpack-file-realm\" (UniqueName: \"kubernetes.io/secret/8128176f-ed6a-4d47-84dd-11b2ada7a5ca-elastic-internal-xpack-file-realm\") pod \"elasticsearch-es-default-0\" (UID: \"8128176f-ed6a-4d47-84dd-11b2ada7a5ca\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 23 11:02:02 crc kubenswrapper[4957]: I0123 11:02:02.608095 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-scripts\" (UniqueName: \"kubernetes.io/configmap/8128176f-ed6a-4d47-84dd-11b2ada7a5ca-elastic-internal-scripts\") pod 
\"elasticsearch-es-default-0\" (UID: \"8128176f-ed6a-4d47-84dd-11b2ada7a5ca\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 23 11:02:02 crc kubenswrapper[4957]: I0123 11:02:02.608115 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-http-certificates\" (UniqueName: \"kubernetes.io/secret/8128176f-ed6a-4d47-84dd-11b2ada7a5ca-elastic-internal-http-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"8128176f-ed6a-4d47-84dd-11b2ada7a5ca\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 23 11:02:02 crc kubenswrapper[4957]: I0123 11:02:02.608325 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elasticsearch-logs\" (UniqueName: \"kubernetes.io/empty-dir/8128176f-ed6a-4d47-84dd-11b2ada7a5ca-elasticsearch-logs\") pod \"elasticsearch-es-default-0\" (UID: \"8128176f-ed6a-4d47-84dd-11b2ada7a5ca\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 23 11:02:02 crc kubenswrapper[4957]: I0123 11:02:02.608409 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-unicast-hosts\" (UniqueName: \"kubernetes.io/configmap/8128176f-ed6a-4d47-84dd-11b2ada7a5ca-elastic-internal-unicast-hosts\") pod \"elasticsearch-es-default-0\" (UID: \"8128176f-ed6a-4d47-84dd-11b2ada7a5ca\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 23 11:02:02 crc kubenswrapper[4957]: I0123 11:02:02.608453 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-probe-user\" (UniqueName: \"kubernetes.io/secret/8128176f-ed6a-4d47-84dd-11b2ada7a5ca-elastic-internal-probe-user\") pod \"elasticsearch-es-default-0\" (UID: \"8128176f-ed6a-4d47-84dd-11b2ada7a5ca\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 23 11:02:02 crc kubenswrapper[4957]: I0123 11:02:02.608482 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-elasticsearch-bin-local\" (UniqueName: \"kubernetes.io/empty-dir/8128176f-ed6a-4d47-84dd-11b2ada7a5ca-elastic-internal-elasticsearch-bin-local\") pod \"elasticsearch-es-default-0\" (UID: \"8128176f-ed6a-4d47-84dd-11b2ada7a5ca\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 23 11:02:02 crc kubenswrapper[4957]: I0123 11:02:02.608525 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-elasticsearch-config-local\" (UniqueName: \"kubernetes.io/empty-dir/8128176f-ed6a-4d47-84dd-11b2ada7a5ca-elastic-internal-elasticsearch-config-local\") pod \"elasticsearch-es-default-0\" (UID: \"8128176f-ed6a-4d47-84dd-11b2ada7a5ca\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 23 11:02:02 crc kubenswrapper[4957]: I0123 11:02:02.608562 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-remote-certificate-authorities\" (UniqueName: \"kubernetes.io/secret/8128176f-ed6a-4d47-84dd-11b2ada7a5ca-elastic-internal-remote-certificate-authorities\") pod \"elasticsearch-es-default-0\" (UID: \"8128176f-ed6a-4d47-84dd-11b2ada7a5ca\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 23 11:02:02 crc kubenswrapper[4957]: I0123 11:02:02.608591 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-elasticsearch-config\" (UniqueName: 
\"kubernetes.io/secret/8128176f-ed6a-4d47-84dd-11b2ada7a5ca-elastic-internal-elasticsearch-config\") pod \"elasticsearch-es-default-0\" (UID: \"8128176f-ed6a-4d47-84dd-11b2ada7a5ca\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 23 11:02:02 crc kubenswrapper[4957]: I0123 11:02:02.608642 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elasticsearch-data\" (UniqueName: \"kubernetes.io/empty-dir/8128176f-ed6a-4d47-84dd-11b2ada7a5ca-elasticsearch-data\") pod \"elasticsearch-es-default-0\" (UID: \"8128176f-ed6a-4d47-84dd-11b2ada7a5ca\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 23 11:02:02 crc kubenswrapper[4957]: I0123 11:02:02.608669 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-transport-certificates\" (UniqueName: \"kubernetes.io/secret/8128176f-ed6a-4d47-84dd-11b2ada7a5ca-elastic-internal-transport-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"8128176f-ed6a-4d47-84dd-11b2ada7a5ca\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 23 11:02:02 crc kubenswrapper[4957]: I0123 11:02:02.608702 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-volume\" (UniqueName: \"kubernetes.io/empty-dir/8128176f-ed6a-4d47-84dd-11b2ada7a5ca-tmp-volume\") pod \"elasticsearch-es-default-0\" (UID: \"8128176f-ed6a-4d47-84dd-11b2ada7a5ca\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 23 11:02:02 crc kubenswrapper[4957]: I0123 11:02:02.711076 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"downward-api\" (UniqueName: \"kubernetes.io/downward-api/8128176f-ed6a-4d47-84dd-11b2ada7a5ca-downward-api\") pod \"elasticsearch-es-default-0\" (UID: \"8128176f-ed6a-4d47-84dd-11b2ada7a5ca\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 23 11:02:02 crc kubenswrapper[4957]: I0123 11:02:02.711156 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-elasticsearch-plugins-local\" (UniqueName: \"kubernetes.io/empty-dir/8128176f-ed6a-4d47-84dd-11b2ada7a5ca-elastic-internal-elasticsearch-plugins-local\") pod \"elasticsearch-es-default-0\" (UID: \"8128176f-ed6a-4d47-84dd-11b2ada7a5ca\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 23 11:02:02 crc kubenswrapper[4957]: I0123 11:02:02.711192 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-xpack-file-realm\" (UniqueName: \"kubernetes.io/secret/8128176f-ed6a-4d47-84dd-11b2ada7a5ca-elastic-internal-xpack-file-realm\") pod \"elasticsearch-es-default-0\" (UID: \"8128176f-ed6a-4d47-84dd-11b2ada7a5ca\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 23 11:02:02 crc kubenswrapper[4957]: I0123 11:02:02.711222 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-scripts\" (UniqueName: \"kubernetes.io/configmap/8128176f-ed6a-4d47-84dd-11b2ada7a5ca-elastic-internal-scripts\") pod \"elasticsearch-es-default-0\" (UID: \"8128176f-ed6a-4d47-84dd-11b2ada7a5ca\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 23 11:02:02 crc kubenswrapper[4957]: I0123 11:02:02.711259 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-http-certificates\" (UniqueName: \"kubernetes.io/secret/8128176f-ed6a-4d47-84dd-11b2ada7a5ca-elastic-internal-http-certificates\") pod \"elasticsearch-es-default-0\" (UID: 
\"8128176f-ed6a-4d47-84dd-11b2ada7a5ca\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 23 11:02:02 crc kubenswrapper[4957]: I0123 11:02:02.711322 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elasticsearch-logs\" (UniqueName: \"kubernetes.io/empty-dir/8128176f-ed6a-4d47-84dd-11b2ada7a5ca-elasticsearch-logs\") pod \"elasticsearch-es-default-0\" (UID: \"8128176f-ed6a-4d47-84dd-11b2ada7a5ca\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 23 11:02:02 crc kubenswrapper[4957]: I0123 11:02:02.711371 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-unicast-hosts\" (UniqueName: \"kubernetes.io/configmap/8128176f-ed6a-4d47-84dd-11b2ada7a5ca-elastic-internal-unicast-hosts\") pod \"elasticsearch-es-default-0\" (UID: \"8128176f-ed6a-4d47-84dd-11b2ada7a5ca\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 23 11:02:02 crc kubenswrapper[4957]: I0123 11:02:02.711420 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-probe-user\" (UniqueName: \"kubernetes.io/secret/8128176f-ed6a-4d47-84dd-11b2ada7a5ca-elastic-internal-probe-user\") pod \"elasticsearch-es-default-0\" (UID: \"8128176f-ed6a-4d47-84dd-11b2ada7a5ca\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 23 11:02:02 crc kubenswrapper[4957]: I0123 11:02:02.711463 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-elasticsearch-bin-local\" (UniqueName: \"kubernetes.io/empty-dir/8128176f-ed6a-4d47-84dd-11b2ada7a5ca-elastic-internal-elasticsearch-bin-local\") pod \"elasticsearch-es-default-0\" (UID: \"8128176f-ed6a-4d47-84dd-11b2ada7a5ca\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 23 11:02:02 crc kubenswrapper[4957]: I0123 11:02:02.711514 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-elasticsearch-config\" (UniqueName: \"kubernetes.io/secret/8128176f-ed6a-4d47-84dd-11b2ada7a5ca-elastic-internal-elasticsearch-config\") pod \"elasticsearch-es-default-0\" (UID: \"8128176f-ed6a-4d47-84dd-11b2ada7a5ca\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 23 11:02:02 crc kubenswrapper[4957]: I0123 11:02:02.711546 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-elasticsearch-config-local\" (UniqueName: \"kubernetes.io/empty-dir/8128176f-ed6a-4d47-84dd-11b2ada7a5ca-elastic-internal-elasticsearch-config-local\") pod \"elasticsearch-es-default-0\" (UID: \"8128176f-ed6a-4d47-84dd-11b2ada7a5ca\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 23 11:02:02 crc kubenswrapper[4957]: I0123 11:02:02.711578 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-remote-certificate-authorities\" (UniqueName: \"kubernetes.io/secret/8128176f-ed6a-4d47-84dd-11b2ada7a5ca-elastic-internal-remote-certificate-authorities\") pod \"elasticsearch-es-default-0\" (UID: \"8128176f-ed6a-4d47-84dd-11b2ada7a5ca\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 23 11:02:02 crc kubenswrapper[4957]: I0123 11:02:02.711629 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elasticsearch-data\" (UniqueName: \"kubernetes.io/empty-dir/8128176f-ed6a-4d47-84dd-11b2ada7a5ca-elasticsearch-data\") pod \"elasticsearch-es-default-0\" (UID: \"8128176f-ed6a-4d47-84dd-11b2ada7a5ca\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 23 11:02:02 crc 
kubenswrapper[4957]: I0123 11:02:02.711665 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-transport-certificates\" (UniqueName: \"kubernetes.io/secret/8128176f-ed6a-4d47-84dd-11b2ada7a5ca-elastic-internal-transport-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"8128176f-ed6a-4d47-84dd-11b2ada7a5ca\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 23 11:02:02 crc kubenswrapper[4957]: I0123 11:02:02.711702 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp-volume\" (UniqueName: \"kubernetes.io/empty-dir/8128176f-ed6a-4d47-84dd-11b2ada7a5ca-tmp-volume\") pod \"elasticsearch-es-default-0\" (UID: \"8128176f-ed6a-4d47-84dd-11b2ada7a5ca\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 23 11:02:02 crc kubenswrapper[4957]: I0123 11:02:02.711703 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-elasticsearch-plugins-local\" (UniqueName: \"kubernetes.io/empty-dir/8128176f-ed6a-4d47-84dd-11b2ada7a5ca-elastic-internal-elasticsearch-plugins-local\") pod \"elasticsearch-es-default-0\" (UID: \"8128176f-ed6a-4d47-84dd-11b2ada7a5ca\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 23 11:02:02 crc kubenswrapper[4957]: I0123 11:02:02.712727 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-unicast-hosts\" (UniqueName: \"kubernetes.io/configmap/8128176f-ed6a-4d47-84dd-11b2ada7a5ca-elastic-internal-unicast-hosts\") pod \"elasticsearch-es-default-0\" (UID: \"8128176f-ed6a-4d47-84dd-11b2ada7a5ca\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 23 11:02:02 crc kubenswrapper[4957]: I0123 11:02:02.712912 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-scripts\" (UniqueName: \"kubernetes.io/configmap/8128176f-ed6a-4d47-84dd-11b2ada7a5ca-elastic-internal-scripts\") pod \"elasticsearch-es-default-0\" (UID: \"8128176f-ed6a-4d47-84dd-11b2ada7a5ca\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 23 11:02:02 crc kubenswrapper[4957]: I0123 11:02:02.713148 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp-volume\" (UniqueName: \"kubernetes.io/empty-dir/8128176f-ed6a-4d47-84dd-11b2ada7a5ca-tmp-volume\") pod \"elasticsearch-es-default-0\" (UID: \"8128176f-ed6a-4d47-84dd-11b2ada7a5ca\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 23 11:02:02 crc kubenswrapper[4957]: I0123 11:02:02.713860 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elasticsearch-logs\" (UniqueName: \"kubernetes.io/empty-dir/8128176f-ed6a-4d47-84dd-11b2ada7a5ca-elasticsearch-logs\") pod \"elasticsearch-es-default-0\" (UID: \"8128176f-ed6a-4d47-84dd-11b2ada7a5ca\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 23 11:02:02 crc kubenswrapper[4957]: I0123 11:02:02.714751 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-elasticsearch-bin-local\" (UniqueName: \"kubernetes.io/empty-dir/8128176f-ed6a-4d47-84dd-11b2ada7a5ca-elastic-internal-elasticsearch-bin-local\") pod \"elasticsearch-es-default-0\" (UID: \"8128176f-ed6a-4d47-84dd-11b2ada7a5ca\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 23 11:02:02 crc kubenswrapper[4957]: I0123 11:02:02.714929 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-elasticsearch-config-local\" (UniqueName: 
\"kubernetes.io/empty-dir/8128176f-ed6a-4d47-84dd-11b2ada7a5ca-elastic-internal-elasticsearch-config-local\") pod \"elasticsearch-es-default-0\" (UID: \"8128176f-ed6a-4d47-84dd-11b2ada7a5ca\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 23 11:02:02 crc kubenswrapper[4957]: I0123 11:02:02.715103 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elasticsearch-data\" (UniqueName: \"kubernetes.io/empty-dir/8128176f-ed6a-4d47-84dd-11b2ada7a5ca-elasticsearch-data\") pod \"elasticsearch-es-default-0\" (UID: \"8128176f-ed6a-4d47-84dd-11b2ada7a5ca\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 23 11:02:02 crc kubenswrapper[4957]: I0123 11:02:02.721142 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-http-certificates\" (UniqueName: \"kubernetes.io/secret/8128176f-ed6a-4d47-84dd-11b2ada7a5ca-elastic-internal-http-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"8128176f-ed6a-4d47-84dd-11b2ada7a5ca\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 23 11:02:02 crc kubenswrapper[4957]: I0123 11:02:02.721152 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-probe-user\" (UniqueName: \"kubernetes.io/secret/8128176f-ed6a-4d47-84dd-11b2ada7a5ca-elastic-internal-probe-user\") pod \"elasticsearch-es-default-0\" (UID: \"8128176f-ed6a-4d47-84dd-11b2ada7a5ca\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 23 11:02:02 crc kubenswrapper[4957]: I0123 11:02:02.721738 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-remote-certificate-authorities\" (UniqueName: \"kubernetes.io/secret/8128176f-ed6a-4d47-84dd-11b2ada7a5ca-elastic-internal-remote-certificate-authorities\") pod \"elasticsearch-es-default-0\" (UID: \"8128176f-ed6a-4d47-84dd-11b2ada7a5ca\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 23 11:02:02 crc kubenswrapper[4957]: I0123 11:02:02.722170 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-elasticsearch-config\" (UniqueName: \"kubernetes.io/secret/8128176f-ed6a-4d47-84dd-11b2ada7a5ca-elastic-internal-elasticsearch-config\") pod \"elasticsearch-es-default-0\" (UID: \"8128176f-ed6a-4d47-84dd-11b2ada7a5ca\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 23 11:02:02 crc kubenswrapper[4957]: I0123 11:02:02.725990 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-transport-certificates\" (UniqueName: \"kubernetes.io/secret/8128176f-ed6a-4d47-84dd-11b2ada7a5ca-elastic-internal-transport-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"8128176f-ed6a-4d47-84dd-11b2ada7a5ca\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 23 11:02:02 crc kubenswrapper[4957]: I0123 11:02:02.738551 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"downward-api\" (UniqueName: \"kubernetes.io/downward-api/8128176f-ed6a-4d47-84dd-11b2ada7a5ca-downward-api\") pod \"elasticsearch-es-default-0\" (UID: \"8128176f-ed6a-4d47-84dd-11b2ada7a5ca\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 23 11:02:02 crc kubenswrapper[4957]: I0123 11:02:02.739756 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-xpack-file-realm\" (UniqueName: \"kubernetes.io/secret/8128176f-ed6a-4d47-84dd-11b2ada7a5ca-elastic-internal-xpack-file-realm\") pod \"elasticsearch-es-default-0\" (UID: \"8128176f-ed6a-4d47-84dd-11b2ada7a5ca\") " 
pod="service-telemetry/elasticsearch-es-default-0" Jan 23 11:02:02 crc kubenswrapper[4957]: I0123 11:02:02.792054 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/elasticsearch-es-default-0" Jan 23 11:02:03 crc kubenswrapper[4957]: I0123 11:02:03.180138 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Jan 23 11:02:03 crc kubenswrapper[4957]: W0123 11:02:03.211460 4957 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8128176f_ed6a_4d47_84dd_11b2ada7a5ca.slice/crio-806aca279b06fe03c3f1cea9f00c09992fb216d44085e561b6c1b5ffabb28d57 WatchSource:0}: Error finding container 806aca279b06fe03c3f1cea9f00c09992fb216d44085e561b6c1b5ffabb28d57: Status 404 returned error can't find the container with id 806aca279b06fe03c3f1cea9f00c09992fb216d44085e561b6c1b5ffabb28d57 Jan 23 11:02:03 crc kubenswrapper[4957]: I0123 11:02:03.932476 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/interconnect-operator-5bb49f789d-v464f" event={"ID":"834fa095-b096-4231-b085-dbcfe6fac038","Type":"ContainerStarted","Data":"70fabec3f9b66d7aaa1e5b1663f7a2dca7c6b36988d986787fcfa58bd48cfda2"} Jan 23 11:02:03 crc kubenswrapper[4957]: I0123 11:02:03.933639 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"8128176f-ed6a-4d47-84dd-11b2ada7a5ca","Type":"ContainerStarted","Data":"806aca279b06fe03c3f1cea9f00c09992fb216d44085e561b6c1b5ffabb28d57"} Jan 23 11:02:03 crc kubenswrapper[4957]: I0123 11:02:03.951588 4957 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/interconnect-operator-5bb49f789d-v464f" podStartSLOduration=5.419827825 podStartE2EDuration="12.951565189s" podCreationTimestamp="2026-01-23 11:01:51 +0000 UTC" firstStartedPulling="2026-01-23 11:01:55.542768348 +0000 UTC m=+625.080021035" lastFinishedPulling="2026-01-23 11:02:03.074505712 +0000 UTC m=+632.611758399" observedRunningTime="2026-01-23 11:02:03.947615737 +0000 UTC m=+633.484868434" watchObservedRunningTime="2026-01-23 11:02:03.951565189 +0000 UTC m=+633.488817886" Jan 23 11:02:11 crc kubenswrapper[4957]: I0123 11:02:11.036908 4957 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Jan 23 11:02:11 crc kubenswrapper[4957]: I0123 11:02:11.038445 4957 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-1-build" Jan 23 11:02:11 crc kubenswrapper[4957]: I0123 11:02:11.040209 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-1-sys-config" Jan 23 11:02:11 crc kubenswrapper[4957]: I0123 11:02:11.045186 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-1-global-ca" Jan 23 11:02:11 crc kubenswrapper[4957]: I0123 11:02:11.045353 4957 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-9pp79" Jan 23 11:02:11 crc kubenswrapper[4957]: I0123 11:02:11.045458 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-1-ca" Jan 23 11:02:11 crc kubenswrapper[4957]: I0123 11:02:11.068541 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Jan 23 11:02:11 crc kubenswrapper[4957]: I0123 11:02:11.132771 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-td6vn\" (UniqueName: \"kubernetes.io/projected/5ff53e4f-d301-4c55-aa1b-4a80759dc4e3-kube-api-access-td6vn\") pod \"service-telemetry-operator-1-build\" (UID: \"5ff53e4f-d301-4c55-aa1b-4a80759dc4e3\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 23 11:02:11 crc kubenswrapper[4957]: I0123 11:02:11.132822 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5ff53e4f-d301-4c55-aa1b-4a80759dc4e3-build-proxy-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"5ff53e4f-d301-4c55-aa1b-4a80759dc4e3\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 23 11:02:11 crc kubenswrapper[4957]: I0123 11:02:11.132862 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-9pp79-push\" (UniqueName: \"kubernetes.io/secret/5ff53e4f-d301-4c55-aa1b-4a80759dc4e3-builder-dockercfg-9pp79-push\") pod \"service-telemetry-operator-1-build\" (UID: \"5ff53e4f-d301-4c55-aa1b-4a80759dc4e3\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 23 11:02:11 crc kubenswrapper[4957]: I0123 11:02:11.132896 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5ff53e4f-d301-4c55-aa1b-4a80759dc4e3-build-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"5ff53e4f-d301-4c55-aa1b-4a80759dc4e3\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 23 11:02:11 crc kubenswrapper[4957]: I0123 11:02:11.132924 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/5ff53e4f-d301-4c55-aa1b-4a80759dc4e3-container-storage-root\") pod \"service-telemetry-operator-1-build\" (UID: \"5ff53e4f-d301-4c55-aa1b-4a80759dc4e3\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 23 11:02:11 crc kubenswrapper[4957]: I0123 11:02:11.132949 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/5ff53e4f-d301-4c55-aa1b-4a80759dc4e3-buildcachedir\") pod \"service-telemetry-operator-1-build\" (UID: 
\"5ff53e4f-d301-4c55-aa1b-4a80759dc4e3\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 23 11:02:11 crc kubenswrapper[4957]: I0123 11:02:11.132981 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-9pp79-pull\" (UniqueName: \"kubernetes.io/secret/5ff53e4f-d301-4c55-aa1b-4a80759dc4e3-builder-dockercfg-9pp79-pull\") pod \"service-telemetry-operator-1-build\" (UID: \"5ff53e4f-d301-4c55-aa1b-4a80759dc4e3\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 23 11:02:11 crc kubenswrapper[4957]: I0123 11:02:11.133001 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/5ff53e4f-d301-4c55-aa1b-4a80759dc4e3-build-blob-cache\") pod \"service-telemetry-operator-1-build\" (UID: \"5ff53e4f-d301-4c55-aa1b-4a80759dc4e3\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 23 11:02:11 crc kubenswrapper[4957]: I0123 11:02:11.133027 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/5ff53e4f-d301-4c55-aa1b-4a80759dc4e3-build-system-configs\") pod \"service-telemetry-operator-1-build\" (UID: \"5ff53e4f-d301-4c55-aa1b-4a80759dc4e3\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 23 11:02:11 crc kubenswrapper[4957]: I0123 11:02:11.133094 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/5ff53e4f-d301-4c55-aa1b-4a80759dc4e3-buildworkdir\") pod \"service-telemetry-operator-1-build\" (UID: \"5ff53e4f-d301-4c55-aa1b-4a80759dc4e3\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 23 11:02:11 crc kubenswrapper[4957]: I0123 11:02:11.133180 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/5ff53e4f-d301-4c55-aa1b-4a80759dc4e3-container-storage-run\") pod \"service-telemetry-operator-1-build\" (UID: \"5ff53e4f-d301-4c55-aa1b-4a80759dc4e3\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 23 11:02:11 crc kubenswrapper[4957]: I0123 11:02:11.133245 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/5ff53e4f-d301-4c55-aa1b-4a80759dc4e3-node-pullsecrets\") pod \"service-telemetry-operator-1-build\" (UID: \"5ff53e4f-d301-4c55-aa1b-4a80759dc4e3\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 23 11:02:11 crc kubenswrapper[4957]: I0123 11:02:11.234041 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-td6vn\" (UniqueName: \"kubernetes.io/projected/5ff53e4f-d301-4c55-aa1b-4a80759dc4e3-kube-api-access-td6vn\") pod \"service-telemetry-operator-1-build\" (UID: \"5ff53e4f-d301-4c55-aa1b-4a80759dc4e3\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 23 11:02:11 crc kubenswrapper[4957]: I0123 11:02:11.234099 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5ff53e4f-d301-4c55-aa1b-4a80759dc4e3-build-proxy-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"5ff53e4f-d301-4c55-aa1b-4a80759dc4e3\") " 
pod="service-telemetry/service-telemetry-operator-1-build" Jan 23 11:02:11 crc kubenswrapper[4957]: I0123 11:02:11.234133 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-9pp79-push\" (UniqueName: \"kubernetes.io/secret/5ff53e4f-d301-4c55-aa1b-4a80759dc4e3-builder-dockercfg-9pp79-push\") pod \"service-telemetry-operator-1-build\" (UID: \"5ff53e4f-d301-4c55-aa1b-4a80759dc4e3\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 23 11:02:11 crc kubenswrapper[4957]: I0123 11:02:11.234165 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5ff53e4f-d301-4c55-aa1b-4a80759dc4e3-build-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"5ff53e4f-d301-4c55-aa1b-4a80759dc4e3\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 23 11:02:11 crc kubenswrapper[4957]: I0123 11:02:11.234203 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/5ff53e4f-d301-4c55-aa1b-4a80759dc4e3-container-storage-root\") pod \"service-telemetry-operator-1-build\" (UID: \"5ff53e4f-d301-4c55-aa1b-4a80759dc4e3\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 23 11:02:11 crc kubenswrapper[4957]: I0123 11:02:11.234229 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/5ff53e4f-d301-4c55-aa1b-4a80759dc4e3-buildcachedir\") pod \"service-telemetry-operator-1-build\" (UID: \"5ff53e4f-d301-4c55-aa1b-4a80759dc4e3\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 23 11:02:11 crc kubenswrapper[4957]: I0123 11:02:11.234261 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-9pp79-pull\" (UniqueName: \"kubernetes.io/secret/5ff53e4f-d301-4c55-aa1b-4a80759dc4e3-builder-dockercfg-9pp79-pull\") pod \"service-telemetry-operator-1-build\" (UID: \"5ff53e4f-d301-4c55-aa1b-4a80759dc4e3\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 23 11:02:11 crc kubenswrapper[4957]: I0123 11:02:11.234301 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/5ff53e4f-d301-4c55-aa1b-4a80759dc4e3-build-blob-cache\") pod \"service-telemetry-operator-1-build\" (UID: \"5ff53e4f-d301-4c55-aa1b-4a80759dc4e3\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 23 11:02:11 crc kubenswrapper[4957]: I0123 11:02:11.234326 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/5ff53e4f-d301-4c55-aa1b-4a80759dc4e3-build-system-configs\") pod \"service-telemetry-operator-1-build\" (UID: \"5ff53e4f-d301-4c55-aa1b-4a80759dc4e3\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 23 11:02:11 crc kubenswrapper[4957]: I0123 11:02:11.234350 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/5ff53e4f-d301-4c55-aa1b-4a80759dc4e3-buildworkdir\") pod \"service-telemetry-operator-1-build\" (UID: \"5ff53e4f-d301-4c55-aa1b-4a80759dc4e3\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 23 11:02:11 crc kubenswrapper[4957]: I0123 11:02:11.234385 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/5ff53e4f-d301-4c55-aa1b-4a80759dc4e3-container-storage-run\") pod \"service-telemetry-operator-1-build\" (UID: \"5ff53e4f-d301-4c55-aa1b-4a80759dc4e3\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 23 11:02:11 crc kubenswrapper[4957]: I0123 11:02:11.234418 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/5ff53e4f-d301-4c55-aa1b-4a80759dc4e3-node-pullsecrets\") pod \"service-telemetry-operator-1-build\" (UID: \"5ff53e4f-d301-4c55-aa1b-4a80759dc4e3\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 23 11:02:11 crc kubenswrapper[4957]: I0123 11:02:11.234547 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/5ff53e4f-d301-4c55-aa1b-4a80759dc4e3-node-pullsecrets\") pod \"service-telemetry-operator-1-build\" (UID: \"5ff53e4f-d301-4c55-aa1b-4a80759dc4e3\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 23 11:02:11 crc kubenswrapper[4957]: I0123 11:02:11.235208 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/5ff53e4f-d301-4c55-aa1b-4a80759dc4e3-build-blob-cache\") pod \"service-telemetry-operator-1-build\" (UID: \"5ff53e4f-d301-4c55-aa1b-4a80759dc4e3\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 23 11:02:11 crc kubenswrapper[4957]: I0123 11:02:11.235414 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5ff53e4f-d301-4c55-aa1b-4a80759dc4e3-build-proxy-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"5ff53e4f-d301-4c55-aa1b-4a80759dc4e3\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 23 11:02:11 crc kubenswrapper[4957]: I0123 11:02:11.235485 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/5ff53e4f-d301-4c55-aa1b-4a80759dc4e3-buildworkdir\") pod \"service-telemetry-operator-1-build\" (UID: \"5ff53e4f-d301-4c55-aa1b-4a80759dc4e3\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 23 11:02:11 crc kubenswrapper[4957]: I0123 11:02:11.235716 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/5ff53e4f-d301-4c55-aa1b-4a80759dc4e3-buildcachedir\") pod \"service-telemetry-operator-1-build\" (UID: \"5ff53e4f-d301-4c55-aa1b-4a80759dc4e3\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 23 11:02:11 crc kubenswrapper[4957]: I0123 11:02:11.235826 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/5ff53e4f-d301-4c55-aa1b-4a80759dc4e3-build-system-configs\") pod \"service-telemetry-operator-1-build\" (UID: \"5ff53e4f-d301-4c55-aa1b-4a80759dc4e3\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 23 11:02:11 crc kubenswrapper[4957]: I0123 11:02:11.236484 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5ff53e4f-d301-4c55-aa1b-4a80759dc4e3-build-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"5ff53e4f-d301-4c55-aa1b-4a80759dc4e3\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 23 11:02:11 crc 
kubenswrapper[4957]: I0123 11:02:11.236825 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/5ff53e4f-d301-4c55-aa1b-4a80759dc4e3-container-storage-root\") pod \"service-telemetry-operator-1-build\" (UID: \"5ff53e4f-d301-4c55-aa1b-4a80759dc4e3\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 23 11:02:11 crc kubenswrapper[4957]: I0123 11:02:11.236911 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/5ff53e4f-d301-4c55-aa1b-4a80759dc4e3-container-storage-run\") pod \"service-telemetry-operator-1-build\" (UID: \"5ff53e4f-d301-4c55-aa1b-4a80759dc4e3\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 23 11:02:11 crc kubenswrapper[4957]: I0123 11:02:11.240045 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-9pp79-pull\" (UniqueName: \"kubernetes.io/secret/5ff53e4f-d301-4c55-aa1b-4a80759dc4e3-builder-dockercfg-9pp79-pull\") pod \"service-telemetry-operator-1-build\" (UID: \"5ff53e4f-d301-4c55-aa1b-4a80759dc4e3\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 23 11:02:11 crc kubenswrapper[4957]: I0123 11:02:11.240797 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-9pp79-push\" (UniqueName: \"kubernetes.io/secret/5ff53e4f-d301-4c55-aa1b-4a80759dc4e3-builder-dockercfg-9pp79-push\") pod \"service-telemetry-operator-1-build\" (UID: \"5ff53e4f-d301-4c55-aa1b-4a80759dc4e3\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 23 11:02:11 crc kubenswrapper[4957]: I0123 11:02:11.252241 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-td6vn\" (UniqueName: \"kubernetes.io/projected/5ff53e4f-d301-4c55-aa1b-4a80759dc4e3-kube-api-access-td6vn\") pod \"service-telemetry-operator-1-build\" (UID: \"5ff53e4f-d301-4c55-aa1b-4a80759dc4e3\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 23 11:02:11 crc kubenswrapper[4957]: I0123 11:02:11.356908 4957 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-1-build" Jan 23 11:02:15 crc kubenswrapper[4957]: I0123 11:02:15.717054 4957 patch_prober.go:28] interesting pod/machine-config-daemon-w2xjv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 23 11:02:15 crc kubenswrapper[4957]: I0123 11:02:15.717474 4957 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w2xjv" podUID="224e3211-1f68-4673-8975-7e71b1e513d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 23 11:02:15 crc kubenswrapper[4957]: I0123 11:02:15.717610 4957 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w2xjv" Jan 23 11:02:15 crc kubenswrapper[4957]: I0123 11:02:15.718499 4957 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"303ca83532cca99052083264393534af7ab68b89a646ac04970e5eb8fdb50844"} pod="openshift-machine-config-operator/machine-config-daemon-w2xjv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 23 11:02:15 crc kubenswrapper[4957]: I0123 11:02:15.718594 4957 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w2xjv" podUID="224e3211-1f68-4673-8975-7e71b1e513d0" containerName="machine-config-daemon" containerID="cri-o://303ca83532cca99052083264393534af7ab68b89a646ac04970e5eb8fdb50844" gracePeriod=600 Jan 23 11:02:19 crc kubenswrapper[4957]: I0123 11:02:19.049588 4957 generic.go:334] "Generic (PLEG): container finished" podID="224e3211-1f68-4673-8975-7e71b1e513d0" containerID="303ca83532cca99052083264393534af7ab68b89a646ac04970e5eb8fdb50844" exitCode=0 Jan 23 11:02:19 crc kubenswrapper[4957]: I0123 11:02:19.049660 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2xjv" event={"ID":"224e3211-1f68-4673-8975-7e71b1e513d0","Type":"ContainerDied","Data":"303ca83532cca99052083264393534af7ab68b89a646ac04970e5eb8fdb50844"} Jan 23 11:02:19 crc kubenswrapper[4957]: I0123 11:02:19.050042 4957 scope.go:117] "RemoveContainer" containerID="380c1560a9b50de2450e83ba786578aa9f30c79645bd086692c358cef7dcbaf6" Jan 23 11:02:21 crc kubenswrapper[4957]: I0123 11:02:21.309359 4957 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Jan 23 11:02:22 crc kubenswrapper[4957]: I0123 11:02:22.984890 4957 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-2-build"] Jan 23 11:02:22 crc kubenswrapper[4957]: I0123 11:02:22.986076 4957 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-2-build" Jan 23 11:02:22 crc kubenswrapper[4957]: I0123 11:02:22.987864 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-2-sys-config" Jan 23 11:02:22 crc kubenswrapper[4957]: I0123 11:02:22.987895 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-2-global-ca" Jan 23 11:02:22 crc kubenswrapper[4957]: I0123 11:02:22.987907 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-2-ca" Jan 23 11:02:23 crc kubenswrapper[4957]: I0123 11:02:23.016619 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-2-build"] Jan 23 11:02:23 crc kubenswrapper[4957]: I0123 11:02:23.107740 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5-node-pullsecrets\") pod \"service-telemetry-operator-2-build\" (UID: \"7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 23 11:02:23 crc kubenswrapper[4957]: I0123 11:02:23.107960 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chpzb\" (UniqueName: \"kubernetes.io/projected/7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5-kube-api-access-chpzb\") pod \"service-telemetry-operator-2-build\" (UID: \"7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 23 11:02:23 crc kubenswrapper[4957]: I0123 11:02:23.108111 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5-build-blob-cache\") pod \"service-telemetry-operator-2-build\" (UID: \"7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 23 11:02:23 crc kubenswrapper[4957]: I0123 11:02:23.108157 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-9pp79-push\" (UniqueName: \"kubernetes.io/secret/7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5-builder-dockercfg-9pp79-push\") pod \"service-telemetry-operator-2-build\" (UID: \"7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 23 11:02:23 crc kubenswrapper[4957]: I0123 11:02:23.108227 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5-build-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 23 11:02:23 crc kubenswrapper[4957]: I0123 11:02:23.108349 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5-buildworkdir\") pod \"service-telemetry-operator-2-build\" (UID: \"7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 23 11:02:23 crc kubenswrapper[4957]: I0123 11:02:23.108414 4957 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5-container-storage-run\") pod \"service-telemetry-operator-2-build\" (UID: \"7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 23 11:02:23 crc kubenswrapper[4957]: I0123 11:02:23.108470 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5-container-storage-root\") pod \"service-telemetry-operator-2-build\" (UID: \"7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 23 11:02:23 crc kubenswrapper[4957]: I0123 11:02:23.108531 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5-buildcachedir\") pod \"service-telemetry-operator-2-build\" (UID: \"7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 23 11:02:23 crc kubenswrapper[4957]: I0123 11:02:23.108576 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5-build-system-configs\") pod \"service-telemetry-operator-2-build\" (UID: \"7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 23 11:02:23 crc kubenswrapper[4957]: I0123 11:02:23.108635 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5-build-proxy-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 23 11:02:23 crc kubenswrapper[4957]: I0123 11:02:23.108703 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-9pp79-pull\" (UniqueName: \"kubernetes.io/secret/7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5-builder-dockercfg-9pp79-pull\") pod \"service-telemetry-operator-2-build\" (UID: \"7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 23 11:02:23 crc kubenswrapper[4957]: I0123 11:02:23.209965 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5-node-pullsecrets\") pod \"service-telemetry-operator-2-build\" (UID: \"7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 23 11:02:23 crc kubenswrapper[4957]: I0123 11:02:23.210065 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chpzb\" (UniqueName: \"kubernetes.io/projected/7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5-kube-api-access-chpzb\") pod \"service-telemetry-operator-2-build\" (UID: \"7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 23 11:02:23 crc kubenswrapper[4957]: I0123 11:02:23.210129 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"builder-dockercfg-9pp79-push\" (UniqueName: \"kubernetes.io/secret/7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5-builder-dockercfg-9pp79-push\") pod \"service-telemetry-operator-2-build\" (UID: \"7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 23 11:02:23 crc kubenswrapper[4957]: I0123 11:02:23.210145 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5-node-pullsecrets\") pod \"service-telemetry-operator-2-build\" (UID: \"7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 23 11:02:23 crc kubenswrapper[4957]: I0123 11:02:23.210162 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5-build-blob-cache\") pod \"service-telemetry-operator-2-build\" (UID: \"7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 23 11:02:23 crc kubenswrapper[4957]: I0123 11:02:23.210223 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5-build-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 23 11:02:23 crc kubenswrapper[4957]: I0123 11:02:23.210261 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5-buildworkdir\") pod \"service-telemetry-operator-2-build\" (UID: \"7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 23 11:02:23 crc kubenswrapper[4957]: I0123 11:02:23.210331 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5-container-storage-run\") pod \"service-telemetry-operator-2-build\" (UID: \"7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 23 11:02:23 crc kubenswrapper[4957]: I0123 11:02:23.210373 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5-container-storage-root\") pod \"service-telemetry-operator-2-build\" (UID: \"7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 23 11:02:23 crc kubenswrapper[4957]: I0123 11:02:23.210405 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5-buildcachedir\") pod \"service-telemetry-operator-2-build\" (UID: \"7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 23 11:02:23 crc kubenswrapper[4957]: I0123 11:02:23.210434 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5-build-system-configs\") pod \"service-telemetry-operator-2-build\" (UID: \"7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5\") " 
pod="service-telemetry/service-telemetry-operator-2-build" Jan 23 11:02:23 crc kubenswrapper[4957]: I0123 11:02:23.210481 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5-build-proxy-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 23 11:02:23 crc kubenswrapper[4957]: I0123 11:02:23.210511 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-9pp79-pull\" (UniqueName: \"kubernetes.io/secret/7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5-builder-dockercfg-9pp79-pull\") pod \"service-telemetry-operator-2-build\" (UID: \"7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 23 11:02:23 crc kubenswrapper[4957]: I0123 11:02:23.211082 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5-buildcachedir\") pod \"service-telemetry-operator-2-build\" (UID: \"7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 23 11:02:23 crc kubenswrapper[4957]: I0123 11:02:23.644830 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5-build-blob-cache\") pod \"service-telemetry-operator-2-build\" (UID: \"7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 23 11:02:23 crc kubenswrapper[4957]: I0123 11:02:23.645432 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5-container-storage-root\") pod \"service-telemetry-operator-2-build\" (UID: \"7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 23 11:02:23 crc kubenswrapper[4957]: I0123 11:02:23.645577 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5-build-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 23 11:02:23 crc kubenswrapper[4957]: I0123 11:02:23.645747 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5-buildworkdir\") pod \"service-telemetry-operator-2-build\" (UID: \"7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 23 11:02:23 crc kubenswrapper[4957]: I0123 11:02:23.645904 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5-container-storage-run\") pod \"service-telemetry-operator-2-build\" (UID: \"7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 23 11:02:23 crc kubenswrapper[4957]: I0123 11:02:23.646017 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: 
\"kubernetes.io/configmap/7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5-build-system-configs\") pod \"service-telemetry-operator-2-build\" (UID: \"7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 23 11:02:23 crc kubenswrapper[4957]: I0123 11:02:23.646467 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-9pp79-push\" (UniqueName: \"kubernetes.io/secret/7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5-builder-dockercfg-9pp79-push\") pod \"service-telemetry-operator-2-build\" (UID: \"7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 23 11:02:23 crc kubenswrapper[4957]: I0123 11:02:23.646569 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-9pp79-pull\" (UniqueName: \"kubernetes.io/secret/7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5-builder-dockercfg-9pp79-pull\") pod \"service-telemetry-operator-2-build\" (UID: \"7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 23 11:02:23 crc kubenswrapper[4957]: I0123 11:02:23.646800 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5-build-proxy-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 23 11:02:23 crc kubenswrapper[4957]: I0123 11:02:23.647324 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chpzb\" (UniqueName: \"kubernetes.io/projected/7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5-kube-api-access-chpzb\") pod \"service-telemetry-operator-2-build\" (UID: \"7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 23 11:02:23 crc kubenswrapper[4957]: I0123 11:02:23.901080 4957 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-2-build" Jan 23 11:02:28 crc kubenswrapper[4957]: I0123 11:02:28.461494 4957 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Jan 23 11:02:29 crc kubenswrapper[4957]: E0123 11:02:29.021580 4957 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="registry.connect.redhat.com/elastic/elasticsearch:7.17.20" Jan 23 11:02:29 crc kubenswrapper[4957]: E0123 11:02:29.022503 4957 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:elastic-internal-init-filesystem,Image:registry.connect.redhat.com/elastic/elasticsearch:7.17.20,Command:[bash -c /mnt/elastic-internal/scripts/prepare-fs.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:HEADLESS_SERVICE_NAME,Value:elasticsearch-es-default,ValueFrom:nil,},EnvVar{Name:PROBE_PASSWORD_PATH,Value:/mnt/elastic-internal/pod-mounted-users/elastic-internal-probe,ValueFrom:nil,},EnvVar{Name:PROBE_USERNAME,Value:elastic-internal-probe,ValueFrom:nil,},EnvVar{Name:READINESS_PROBE_PROTOCOL,Value:https,ValueFrom:nil,},EnvVar{Name:NSS_SDB_USE_CACHE,Value:no,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Requests:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:downward-api,ReadOnly:true,MountPath:/mnt/elastic-internal/downward-api,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-elasticsearch-bin-local,ReadOnly:false,MountPath:/mnt/elastic-internal/elasticsearch-bin-local,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-elasticsearch-config,ReadOnly:true,MountPath:/mnt/elastic-internal/elasticsearch-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-elasticsearch-config-local,ReadOnly:false,MountPath:/mnt/elastic-internal/elasticsearch-config-local,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-elasticsearch-plugins-local,ReadOnly:false,MountPath:/mnt/elastic-internal/elasticsearch-plugins-local,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-http-certificates,ReadOnly:true,MountPath:/usr/share/elasticsearch/config/http-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-probe-user,ReadOnly:true,MountPath:/mnt/elastic-internal/pod-mounted-users,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-remote-certificate-authorities,ReadOnly:true,MountPath:/usr/share/elasticsearch/config/transport-remote-certs/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-scripts,ReadOnly:true,MountPath:/mnt/elastic-internal/scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-transport-certificates,ReadOnly:true,MountPath:/mnt/elastic-internal/transport-certificates,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-unicast-hosts,ReadOnly:true,MountPath:/mnt/elastic-internal/unicast-hosts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-xpack-file-realm,ReadOnly:true,MountPath:/mnt/elastic-internal/xpack-file-realm,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elasticsearch-data,ReadOnly:false,MountPath:/usr/share/elasticsearch/data,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elasticsearch-logs,ReadOnly:false,MountPath:/usr/share/elasticsearch/logs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tmp-volume,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*1000670000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod elasticsearch-es-default-0_service-telemetry(8128176f-ed6a-4d47-84dd-11b2ada7a5ca): ErrImagePull: rpc error: code = Canceled desc = copying config: context 
canceled" logger="UnhandledError" Jan 23 11:02:29 crc kubenswrapper[4957]: E0123 11:02:29.024157 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"elastic-internal-init-filesystem\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="service-telemetry/elasticsearch-es-default-0" podUID="8128176f-ed6a-4d47-84dd-11b2ada7a5ca" Jan 23 11:02:29 crc kubenswrapper[4957]: I0123 11:02:29.113849 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-2-build"] Jan 23 11:02:29 crc kubenswrapper[4957]: I0123 11:02:29.116888 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2xjv" event={"ID":"224e3211-1f68-4673-8975-7e71b1e513d0","Type":"ContainerStarted","Data":"16ac91441901705df207b992dbdbed5b8c06671e49d0ee3176372ea44a7ecdf1"} Jan 23 11:02:29 crc kubenswrapper[4957]: I0123 11:02:29.119913 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-5446d6888b-kw5w6" event={"ID":"e702864e-6944-4db0-be75-a846b989b105","Type":"ContainerStarted","Data":"41d087d3ce330598ca65561c73080acb84d2a163a1adebcfdf529cd65b352b46"} Jan 23 11:02:29 crc kubenswrapper[4957]: I0123 11:02:29.121502 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-1-build" event={"ID":"5ff53e4f-d301-4c55-aa1b-4a80759dc4e3","Type":"ContainerStarted","Data":"39b4624736c4b94a349655b79e842f08df4c19cb181065121a1b243d3414a1d6"} Jan 23 11:02:29 crc kubenswrapper[4957]: W0123 11:02:29.122585 4957 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7a6a7ba7_e3b4_4251_91e1_3ca02ae7e0d5.slice/crio-2ccd61819f914a18a14592b6e949217d1451a0e49bafa4293dc94dc472c8ee54 WatchSource:0}: Error finding container 2ccd61819f914a18a14592b6e949217d1451a0e49bafa4293dc94dc472c8ee54: Status 404 returned error can't find the container with id 2ccd61819f914a18a14592b6e949217d1451a0e49bafa4293dc94dc472c8ee54 Jan 23 11:02:29 crc kubenswrapper[4957]: E0123 11:02:29.122733 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"elastic-internal-init-filesystem\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.connect.redhat.com/elastic/elasticsearch:7.17.20\\\"\"" pod="service-telemetry/elasticsearch-es-default-0" podUID="8128176f-ed6a-4d47-84dd-11b2ada7a5ca" Jan 23 11:02:29 crc kubenswrapper[4957]: I0123 11:02:29.256946 4957 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-5446d6888b-kw5w6" podStartSLOduration=2.6068842610000003 podStartE2EDuration="32.256931454s" podCreationTimestamp="2026-01-23 11:01:57 +0000 UTC" firstStartedPulling="2026-01-23 11:01:58.991708937 +0000 UTC m=+628.528961624" lastFinishedPulling="2026-01-23 11:02:28.64175612 +0000 UTC m=+658.179008817" observedRunningTime="2026-01-23 11:02:29.255618507 +0000 UTC m=+658.792871194" watchObservedRunningTime="2026-01-23 11:02:29.256931454 +0000 UTC m=+658.794184161" Jan 23 11:02:29 crc kubenswrapper[4957]: I0123 11:02:29.465215 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Jan 23 11:02:29 crc kubenswrapper[4957]: I0123 11:02:29.500824 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["service-telemetry/elasticsearch-es-default-0"] Jan 23 11:02:30 crc kubenswrapper[4957]: I0123 11:02:30.127467 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5","Type":"ContainerStarted","Data":"2ccd61819f914a18a14592b6e949217d1451a0e49bafa4293dc94dc472c8ee54"} Jan 23 11:02:30 crc kubenswrapper[4957]: E0123 11:02:30.129750 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"elastic-internal-init-filesystem\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.connect.redhat.com/elastic/elasticsearch:7.17.20\\\"\"" pod="service-telemetry/elasticsearch-es-default-0" podUID="8128176f-ed6a-4d47-84dd-11b2ada7a5ca" Jan 23 11:02:31 crc kubenswrapper[4957]: E0123 11:02:31.135503 4957 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"elastic-internal-init-filesystem\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.connect.redhat.com/elastic/elasticsearch:7.17.20\\\"\"" pod="service-telemetry/elasticsearch-es-default-0" podUID="8128176f-ed6a-4d47-84dd-11b2ada7a5ca" Jan 23 11:02:33 crc kubenswrapper[4957]: I0123 11:02:33.510795 4957 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-f4fb5df64-ncfmh"] Jan 23 11:02:33 crc kubenswrapper[4957]: I0123 11:02:33.512106 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-f4fb5df64-ncfmh" Jan 23 11:02:33 crc kubenswrapper[4957]: I0123 11:02:33.515175 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Jan 23 11:02:33 crc kubenswrapper[4957]: I0123 11:02:33.515435 4957 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-jkt5r" Jan 23 11:02:33 crc kubenswrapper[4957]: I0123 11:02:33.518722 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Jan 23 11:02:33 crc kubenswrapper[4957]: I0123 11:02:33.528420 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-f4fb5df64-ncfmh"] Jan 23 11:02:33 crc kubenswrapper[4957]: I0123 11:02:33.601313 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4f0c5571-a692-4f49-8f15-074769b8c297-bound-sa-token\") pod \"cert-manager-webhook-f4fb5df64-ncfmh\" (UID: \"4f0c5571-a692-4f49-8f15-074769b8c297\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-ncfmh" Jan 23 11:02:33 crc kubenswrapper[4957]: I0123 11:02:33.601381 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llht9\" (UniqueName: \"kubernetes.io/projected/4f0c5571-a692-4f49-8f15-074769b8c297-kube-api-access-llht9\") pod \"cert-manager-webhook-f4fb5df64-ncfmh\" (UID: \"4f0c5571-a692-4f49-8f15-074769b8c297\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-ncfmh" Jan 23 11:02:33 crc kubenswrapper[4957]: I0123 11:02:33.702735 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llht9\" (UniqueName: \"kubernetes.io/projected/4f0c5571-a692-4f49-8f15-074769b8c297-kube-api-access-llht9\") pod \"cert-manager-webhook-f4fb5df64-ncfmh\" (UID: \"4f0c5571-a692-4f49-8f15-074769b8c297\") " 
pod="cert-manager/cert-manager-webhook-f4fb5df64-ncfmh" Jan 23 11:02:33 crc kubenswrapper[4957]: I0123 11:02:33.702849 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4f0c5571-a692-4f49-8f15-074769b8c297-bound-sa-token\") pod \"cert-manager-webhook-f4fb5df64-ncfmh\" (UID: \"4f0c5571-a692-4f49-8f15-074769b8c297\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-ncfmh" Jan 23 11:02:33 crc kubenswrapper[4957]: I0123 11:02:33.720461 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llht9\" (UniqueName: \"kubernetes.io/projected/4f0c5571-a692-4f49-8f15-074769b8c297-kube-api-access-llht9\") pod \"cert-manager-webhook-f4fb5df64-ncfmh\" (UID: \"4f0c5571-a692-4f49-8f15-074769b8c297\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-ncfmh" Jan 23 11:02:33 crc kubenswrapper[4957]: I0123 11:02:33.732648 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4f0c5571-a692-4f49-8f15-074769b8c297-bound-sa-token\") pod \"cert-manager-webhook-f4fb5df64-ncfmh\" (UID: \"4f0c5571-a692-4f49-8f15-074769b8c297\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-ncfmh" Jan 23 11:02:33 crc kubenswrapper[4957]: I0123 11:02:33.827629 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-f4fb5df64-ncfmh" Jan 23 11:02:33 crc kubenswrapper[4957]: I0123 11:02:33.910635 4957 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-855d9ccff4-672j2"] Jan 23 11:02:33 crc kubenswrapper[4957]: I0123 11:02:33.911834 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-855d9ccff4-672j2" Jan 23 11:02:33 crc kubenswrapper[4957]: I0123 11:02:33.914441 4957 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-nmc8q" Jan 23 11:02:33 crc kubenswrapper[4957]: I0123 11:02:33.928235 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-855d9ccff4-672j2"] Jan 23 11:02:34 crc kubenswrapper[4957]: I0123 11:02:34.108688 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c5bcc35d-cb6f-4f1f-9fa2-9b75b0ff008f-bound-sa-token\") pod \"cert-manager-cainjector-855d9ccff4-672j2\" (UID: \"c5bcc35d-cb6f-4f1f-9fa2-9b75b0ff008f\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-672j2" Jan 23 11:02:34 crc kubenswrapper[4957]: I0123 11:02:34.108875 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxcfx\" (UniqueName: \"kubernetes.io/projected/c5bcc35d-cb6f-4f1f-9fa2-9b75b0ff008f-kube-api-access-hxcfx\") pod \"cert-manager-cainjector-855d9ccff4-672j2\" (UID: \"c5bcc35d-cb6f-4f1f-9fa2-9b75b0ff008f\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-672j2" Jan 23 11:02:34 crc kubenswrapper[4957]: I0123 11:02:34.210165 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxcfx\" (UniqueName: \"kubernetes.io/projected/c5bcc35d-cb6f-4f1f-9fa2-9b75b0ff008f-kube-api-access-hxcfx\") pod \"cert-manager-cainjector-855d9ccff4-672j2\" (UID: \"c5bcc35d-cb6f-4f1f-9fa2-9b75b0ff008f\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-672j2" Jan 23 11:02:34 crc 
kubenswrapper[4957]: I0123 11:02:34.210267 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c5bcc35d-cb6f-4f1f-9fa2-9b75b0ff008f-bound-sa-token\") pod \"cert-manager-cainjector-855d9ccff4-672j2\" (UID: \"c5bcc35d-cb6f-4f1f-9fa2-9b75b0ff008f\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-672j2" Jan 23 11:02:34 crc kubenswrapper[4957]: I0123 11:02:34.228650 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c5bcc35d-cb6f-4f1f-9fa2-9b75b0ff008f-bound-sa-token\") pod \"cert-manager-cainjector-855d9ccff4-672j2\" (UID: \"c5bcc35d-cb6f-4f1f-9fa2-9b75b0ff008f\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-672j2" Jan 23 11:02:34 crc kubenswrapper[4957]: I0123 11:02:34.237750 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxcfx\" (UniqueName: \"kubernetes.io/projected/c5bcc35d-cb6f-4f1f-9fa2-9b75b0ff008f-kube-api-access-hxcfx\") pod \"cert-manager-cainjector-855d9ccff4-672j2\" (UID: \"c5bcc35d-cb6f-4f1f-9fa2-9b75b0ff008f\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-672j2" Jan 23 11:02:34 crc kubenswrapper[4957]: I0123 11:02:34.527489 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-855d9ccff4-672j2" Jan 23 11:02:35 crc kubenswrapper[4957]: I0123 11:02:35.585640 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-855d9ccff4-672j2"] Jan 23 11:02:35 crc kubenswrapper[4957]: W0123 11:02:35.594561 4957 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc5bcc35d_cb6f_4f1f_9fa2_9b75b0ff008f.slice/crio-dfb5540fe4327c84d2b2bb7022eba87fa437a5310f6e9c9b203e715b8e3e91a1 WatchSource:0}: Error finding container dfb5540fe4327c84d2b2bb7022eba87fa437a5310f6e9c9b203e715b8e3e91a1: Status 404 returned error can't find the container with id dfb5540fe4327c84d2b2bb7022eba87fa437a5310f6e9c9b203e715b8e3e91a1 Jan 23 11:02:35 crc kubenswrapper[4957]: I0123 11:02:35.743088 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-f4fb5df64-ncfmh"] Jan 23 11:02:36 crc kubenswrapper[4957]: I0123 11:02:36.166157 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5","Type":"ContainerStarted","Data":"d95ca712d6bc1ee4b94968f644fd271510646032dd8f3f29ba88faa75f5e3fb2"} Jan 23 11:02:36 crc kubenswrapper[4957]: I0123 11:02:36.167756 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-855d9ccff4-672j2" event={"ID":"c5bcc35d-cb6f-4f1f-9fa2-9b75b0ff008f","Type":"ContainerStarted","Data":"dfb5540fe4327c84d2b2bb7022eba87fa437a5310f6e9c9b203e715b8e3e91a1"} Jan 23 11:02:36 crc kubenswrapper[4957]: I0123 11:02:36.168943 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-f4fb5df64-ncfmh" event={"ID":"4f0c5571-a692-4f49-8f15-074769b8c297","Type":"ContainerStarted","Data":"0e7b8c2574848a85d0d15bb9c09340efab3740bfe6c3dd92048abe7fecd9a1f2"} Jan 23 11:02:36 crc kubenswrapper[4957]: I0123 11:02:36.171877 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-1-build" 
event={"ID":"5ff53e4f-d301-4c55-aa1b-4a80759dc4e3","Type":"ContainerStarted","Data":"186577752a09243a4f129bff70d62926342e4ff0a3aca7c9eada0695b9ba12fd"} Jan 23 11:02:36 crc kubenswrapper[4957]: I0123 11:02:36.172076 4957 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/service-telemetry-operator-1-build" podUID="5ff53e4f-d301-4c55-aa1b-4a80759dc4e3" containerName="manage-dockerfile" containerID="cri-o://186577752a09243a4f129bff70d62926342e4ff0a3aca7c9eada0695b9ba12fd" gracePeriod=30 Jan 23 11:02:36 crc kubenswrapper[4957]: E0123 11:02:36.288168 4957 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=2867283724404936083, SKID=, AKID=D1:AA:0A:78:C0:D9:52:38:C6:A3:64:4D:E2:34:7C:D4:DF:E9:F7:AF failed: x509: certificate signed by unknown authority" Jan 23 11:02:36 crc kubenswrapper[4957]: I0123 11:02:36.521083 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-1-build_5ff53e4f-d301-4c55-aa1b-4a80759dc4e3/manage-dockerfile/0.log" Jan 23 11:02:36 crc kubenswrapper[4957]: I0123 11:02:36.521604 4957 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-1-build" Jan 23 11:02:36 crc kubenswrapper[4957]: I0123 11:02:36.648183 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/5ff53e4f-d301-4c55-aa1b-4a80759dc4e3-buildcachedir\") pod \"5ff53e4f-d301-4c55-aa1b-4a80759dc4e3\" (UID: \"5ff53e4f-d301-4c55-aa1b-4a80759dc4e3\") " Jan 23 11:02:36 crc kubenswrapper[4957]: I0123 11:02:36.648237 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-td6vn\" (UniqueName: \"kubernetes.io/projected/5ff53e4f-d301-4c55-aa1b-4a80759dc4e3-kube-api-access-td6vn\") pod \"5ff53e4f-d301-4c55-aa1b-4a80759dc4e3\" (UID: \"5ff53e4f-d301-4c55-aa1b-4a80759dc4e3\") " Jan 23 11:02:36 crc kubenswrapper[4957]: I0123 11:02:36.648261 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/5ff53e4f-d301-4c55-aa1b-4a80759dc4e3-container-storage-root\") pod \"5ff53e4f-d301-4c55-aa1b-4a80759dc4e3\" (UID: \"5ff53e4f-d301-4c55-aa1b-4a80759dc4e3\") " Jan 23 11:02:36 crc kubenswrapper[4957]: I0123 11:02:36.648292 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/5ff53e4f-d301-4c55-aa1b-4a80759dc4e3-build-blob-cache\") pod \"5ff53e4f-d301-4c55-aa1b-4a80759dc4e3\" (UID: \"5ff53e4f-d301-4c55-aa1b-4a80759dc4e3\") " Jan 23 11:02:36 crc kubenswrapper[4957]: I0123 11:02:36.648337 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-9pp79-pull\" (UniqueName: \"kubernetes.io/secret/5ff53e4f-d301-4c55-aa1b-4a80759dc4e3-builder-dockercfg-9pp79-pull\") pod \"5ff53e4f-d301-4c55-aa1b-4a80759dc4e3\" (UID: \"5ff53e4f-d301-4c55-aa1b-4a80759dc4e3\") " Jan 23 11:02:36 crc kubenswrapper[4957]: I0123 11:02:36.648329 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5ff53e4f-d301-4c55-aa1b-4a80759dc4e3-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "5ff53e4f-d301-4c55-aa1b-4a80759dc4e3" (UID: "5ff53e4f-d301-4c55-aa1b-4a80759dc4e3"). InnerVolumeSpecName "buildcachedir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 23 11:02:36 crc kubenswrapper[4957]: I0123 11:02:36.648361 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-9pp79-push\" (UniqueName: \"kubernetes.io/secret/5ff53e4f-d301-4c55-aa1b-4a80759dc4e3-builder-dockercfg-9pp79-push\") pod \"5ff53e4f-d301-4c55-aa1b-4a80759dc4e3\" (UID: \"5ff53e4f-d301-4c55-aa1b-4a80759dc4e3\") " Jan 23 11:02:36 crc kubenswrapper[4957]: I0123 11:02:36.648393 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/5ff53e4f-d301-4c55-aa1b-4a80759dc4e3-node-pullsecrets\") pod \"5ff53e4f-d301-4c55-aa1b-4a80759dc4e3\" (UID: \"5ff53e4f-d301-4c55-aa1b-4a80759dc4e3\") " Jan 23 11:02:36 crc kubenswrapper[4957]: I0123 11:02:36.648429 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5ff53e4f-d301-4c55-aa1b-4a80759dc4e3-build-ca-bundles\") pod \"5ff53e4f-d301-4c55-aa1b-4a80759dc4e3\" (UID: \"5ff53e4f-d301-4c55-aa1b-4a80759dc4e3\") " Jan 23 11:02:36 crc kubenswrapper[4957]: I0123 11:02:36.648470 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/5ff53e4f-d301-4c55-aa1b-4a80759dc4e3-buildworkdir\") pod \"5ff53e4f-d301-4c55-aa1b-4a80759dc4e3\" (UID: \"5ff53e4f-d301-4c55-aa1b-4a80759dc4e3\") " Jan 23 11:02:36 crc kubenswrapper[4957]: I0123 11:02:36.648502 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/5ff53e4f-d301-4c55-aa1b-4a80759dc4e3-build-system-configs\") pod \"5ff53e4f-d301-4c55-aa1b-4a80759dc4e3\" (UID: \"5ff53e4f-d301-4c55-aa1b-4a80759dc4e3\") " Jan 23 11:02:36 crc kubenswrapper[4957]: I0123 11:02:36.648522 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5ff53e4f-d301-4c55-aa1b-4a80759dc4e3-build-proxy-ca-bundles\") pod \"5ff53e4f-d301-4c55-aa1b-4a80759dc4e3\" (UID: \"5ff53e4f-d301-4c55-aa1b-4a80759dc4e3\") " Jan 23 11:02:36 crc kubenswrapper[4957]: I0123 11:02:36.648536 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/5ff53e4f-d301-4c55-aa1b-4a80759dc4e3-container-storage-run\") pod \"5ff53e4f-d301-4c55-aa1b-4a80759dc4e3\" (UID: \"5ff53e4f-d301-4c55-aa1b-4a80759dc4e3\") " Jan 23 11:02:36 crc kubenswrapper[4957]: I0123 11:02:36.648775 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ff53e4f-d301-4c55-aa1b-4a80759dc4e3-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "5ff53e4f-d301-4c55-aa1b-4a80759dc4e3" (UID: "5ff53e4f-d301-4c55-aa1b-4a80759dc4e3"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 23 11:02:36 crc kubenswrapper[4957]: I0123 11:02:36.648854 4957 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/5ff53e4f-d301-4c55-aa1b-4a80759dc4e3-buildcachedir\") on node \"crc\" DevicePath \"\"" Jan 23 11:02:36 crc kubenswrapper[4957]: I0123 11:02:36.648871 4957 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/5ff53e4f-d301-4c55-aa1b-4a80759dc4e3-container-storage-root\") on node \"crc\" DevicePath \"\"" Jan 23 11:02:36 crc kubenswrapper[4957]: I0123 11:02:36.649005 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ff53e4f-d301-4c55-aa1b-4a80759dc4e3-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "5ff53e4f-d301-4c55-aa1b-4a80759dc4e3" (UID: "5ff53e4f-d301-4c55-aa1b-4a80759dc4e3"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 23 11:02:36 crc kubenswrapper[4957]: I0123 11:02:36.649072 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5ff53e4f-d301-4c55-aa1b-4a80759dc4e3-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "5ff53e4f-d301-4c55-aa1b-4a80759dc4e3" (UID: "5ff53e4f-d301-4c55-aa1b-4a80759dc4e3"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 23 11:02:36 crc kubenswrapper[4957]: I0123 11:02:36.649261 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ff53e4f-d301-4c55-aa1b-4a80759dc4e3-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "5ff53e4f-d301-4c55-aa1b-4a80759dc4e3" (UID: "5ff53e4f-d301-4c55-aa1b-4a80759dc4e3"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 23 11:02:36 crc kubenswrapper[4957]: I0123 11:02:36.649313 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ff53e4f-d301-4c55-aa1b-4a80759dc4e3-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "5ff53e4f-d301-4c55-aa1b-4a80759dc4e3" (UID: "5ff53e4f-d301-4c55-aa1b-4a80759dc4e3"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 11:02:36 crc kubenswrapper[4957]: I0123 11:02:36.649570 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ff53e4f-d301-4c55-aa1b-4a80759dc4e3-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "5ff53e4f-d301-4c55-aa1b-4a80759dc4e3" (UID: "5ff53e4f-d301-4c55-aa1b-4a80759dc4e3"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 11:02:36 crc kubenswrapper[4957]: I0123 11:02:36.649649 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ff53e4f-d301-4c55-aa1b-4a80759dc4e3-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "5ff53e4f-d301-4c55-aa1b-4a80759dc4e3" (UID: "5ff53e4f-d301-4c55-aa1b-4a80759dc4e3"). InnerVolumeSpecName "container-storage-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 23 11:02:36 crc kubenswrapper[4957]: I0123 11:02:36.649656 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ff53e4f-d301-4c55-aa1b-4a80759dc4e3-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "5ff53e4f-d301-4c55-aa1b-4a80759dc4e3" (UID: "5ff53e4f-d301-4c55-aa1b-4a80759dc4e3"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 11:02:36 crc kubenswrapper[4957]: I0123 11:02:36.668081 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ff53e4f-d301-4c55-aa1b-4a80759dc4e3-builder-dockercfg-9pp79-pull" (OuterVolumeSpecName: "builder-dockercfg-9pp79-pull") pod "5ff53e4f-d301-4c55-aa1b-4a80759dc4e3" (UID: "5ff53e4f-d301-4c55-aa1b-4a80759dc4e3"). InnerVolumeSpecName "builder-dockercfg-9pp79-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 23 11:02:36 crc kubenswrapper[4957]: I0123 11:02:36.668097 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ff53e4f-d301-4c55-aa1b-4a80759dc4e3-builder-dockercfg-9pp79-push" (OuterVolumeSpecName: "builder-dockercfg-9pp79-push") pod "5ff53e4f-d301-4c55-aa1b-4a80759dc4e3" (UID: "5ff53e4f-d301-4c55-aa1b-4a80759dc4e3"). InnerVolumeSpecName "builder-dockercfg-9pp79-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 23 11:02:36 crc kubenswrapper[4957]: I0123 11:02:36.669006 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ff53e4f-d301-4c55-aa1b-4a80759dc4e3-kube-api-access-td6vn" (OuterVolumeSpecName: "kube-api-access-td6vn") pod "5ff53e4f-d301-4c55-aa1b-4a80759dc4e3" (UID: "5ff53e4f-d301-4c55-aa1b-4a80759dc4e3"). InnerVolumeSpecName "kube-api-access-td6vn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 23 11:02:36 crc kubenswrapper[4957]: I0123 11:02:36.763080 4957 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/5ff53e4f-d301-4c55-aa1b-4a80759dc4e3-buildworkdir\") on node \"crc\" DevicePath \"\"" Jan 23 11:02:36 crc kubenswrapper[4957]: I0123 11:02:36.763117 4957 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/5ff53e4f-d301-4c55-aa1b-4a80759dc4e3-build-system-configs\") on node \"crc\" DevicePath \"\"" Jan 23 11:02:36 crc kubenswrapper[4957]: I0123 11:02:36.763126 4957 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5ff53e4f-d301-4c55-aa1b-4a80759dc4e3-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 23 11:02:36 crc kubenswrapper[4957]: I0123 11:02:36.763138 4957 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/5ff53e4f-d301-4c55-aa1b-4a80759dc4e3-container-storage-run\") on node \"crc\" DevicePath \"\"" Jan 23 11:02:36 crc kubenswrapper[4957]: I0123 11:02:36.763149 4957 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-td6vn\" (UniqueName: \"kubernetes.io/projected/5ff53e4f-d301-4c55-aa1b-4a80759dc4e3-kube-api-access-td6vn\") on node \"crc\" DevicePath \"\"" Jan 23 11:02:36 crc kubenswrapper[4957]: I0123 11:02:36.763158 4957 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/5ff53e4f-d301-4c55-aa1b-4a80759dc4e3-build-blob-cache\") on node \"crc\" DevicePath \"\"" Jan 23 11:02:36 crc kubenswrapper[4957]: I0123 11:02:36.763169 4957 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-9pp79-pull\" (UniqueName: \"kubernetes.io/secret/5ff53e4f-d301-4c55-aa1b-4a80759dc4e3-builder-dockercfg-9pp79-pull\") on node \"crc\" DevicePath \"\"" Jan 23 11:02:36 crc kubenswrapper[4957]: I0123 11:02:36.763178 4957 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-9pp79-push\" (UniqueName: \"kubernetes.io/secret/5ff53e4f-d301-4c55-aa1b-4a80759dc4e3-builder-dockercfg-9pp79-push\") on node \"crc\" DevicePath \"\"" Jan 23 11:02:36 crc kubenswrapper[4957]: I0123 11:02:36.763186 4957 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/5ff53e4f-d301-4c55-aa1b-4a80759dc4e3-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Jan 23 11:02:36 crc kubenswrapper[4957]: I0123 11:02:36.763194 4957 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5ff53e4f-d301-4c55-aa1b-4a80759dc4e3-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 23 11:02:37 crc kubenswrapper[4957]: I0123 11:02:37.181303 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-1-build_5ff53e4f-d301-4c55-aa1b-4a80759dc4e3/manage-dockerfile/0.log" Jan 23 11:02:37 crc kubenswrapper[4957]: I0123 11:02:37.181361 4957 generic.go:334] "Generic (PLEG): container finished" podID="5ff53e4f-d301-4c55-aa1b-4a80759dc4e3" containerID="186577752a09243a4f129bff70d62926342e4ff0a3aca7c9eada0695b9ba12fd" exitCode=1 Jan 23 11:02:37 crc kubenswrapper[4957]: I0123 11:02:37.181417 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-1-build" 
event={"ID":"5ff53e4f-d301-4c55-aa1b-4a80759dc4e3","Type":"ContainerDied","Data":"186577752a09243a4f129bff70d62926342e4ff0a3aca7c9eada0695b9ba12fd"} Jan 23 11:02:37 crc kubenswrapper[4957]: I0123 11:02:37.181486 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-1-build" event={"ID":"5ff53e4f-d301-4c55-aa1b-4a80759dc4e3","Type":"ContainerDied","Data":"39b4624736c4b94a349655b79e842f08df4c19cb181065121a1b243d3414a1d6"} Jan 23 11:02:37 crc kubenswrapper[4957]: I0123 11:02:37.181507 4957 scope.go:117] "RemoveContainer" containerID="186577752a09243a4f129bff70d62926342e4ff0a3aca7c9eada0695b9ba12fd" Jan 23 11:02:37 crc kubenswrapper[4957]: I0123 11:02:37.181432 4957 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-1-build" Jan 23 11:02:37 crc kubenswrapper[4957]: I0123 11:02:37.200799 4957 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Jan 23 11:02:37 crc kubenswrapper[4957]: I0123 11:02:37.208717 4957 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Jan 23 11:02:37 crc kubenswrapper[4957]: I0123 11:02:37.213252 4957 scope.go:117] "RemoveContainer" containerID="186577752a09243a4f129bff70d62926342e4ff0a3aca7c9eada0695b9ba12fd" Jan 23 11:02:37 crc kubenswrapper[4957]: E0123 11:02:37.214185 4957 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"186577752a09243a4f129bff70d62926342e4ff0a3aca7c9eada0695b9ba12fd\": container with ID starting with 186577752a09243a4f129bff70d62926342e4ff0a3aca7c9eada0695b9ba12fd not found: ID does not exist" containerID="186577752a09243a4f129bff70d62926342e4ff0a3aca7c9eada0695b9ba12fd" Jan 23 11:02:37 crc kubenswrapper[4957]: I0123 11:02:37.214225 4957 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"186577752a09243a4f129bff70d62926342e4ff0a3aca7c9eada0695b9ba12fd"} err="failed to get container status \"186577752a09243a4f129bff70d62926342e4ff0a3aca7c9eada0695b9ba12fd\": rpc error: code = NotFound desc = could not find container \"186577752a09243a4f129bff70d62926342e4ff0a3aca7c9eada0695b9ba12fd\": container with ID starting with 186577752a09243a4f129bff70d62926342e4ff0a3aca7c9eada0695b9ba12fd not found: ID does not exist" Jan 23 11:02:37 crc kubenswrapper[4957]: I0123 11:02:37.316329 4957 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-operator-2-build"] Jan 23 11:02:38 crc kubenswrapper[4957]: I0123 11:02:38.189958 4957 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/service-telemetry-operator-2-build" podUID="7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5" containerName="git-clone" containerID="cri-o://d95ca712d6bc1ee4b94968f644fd271510646032dd8f3f29ba88faa75f5e3fb2" gracePeriod=30 Jan 23 11:02:38 crc kubenswrapper[4957]: I0123 11:02:38.778871 4957 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ff53e4f-d301-4c55-aa1b-4a80759dc4e3" path="/var/lib/kubelet/pods/5ff53e4f-d301-4c55-aa1b-4a80759dc4e3/volumes" Jan 23 11:02:39 crc kubenswrapper[4957]: I0123 11:02:39.200807 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-2-build_7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5/git-clone/0.log" Jan 23 11:02:39 crc kubenswrapper[4957]: I0123 11:02:39.201121 4957 
generic.go:334] "Generic (PLEG): container finished" podID="7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5" containerID="d95ca712d6bc1ee4b94968f644fd271510646032dd8f3f29ba88faa75f5e3fb2" exitCode=1 Jan 23 11:02:39 crc kubenswrapper[4957]: I0123 11:02:39.201163 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5","Type":"ContainerDied","Data":"d95ca712d6bc1ee4b94968f644fd271510646032dd8f3f29ba88faa75f5e3fb2"} Jan 23 11:02:42 crc kubenswrapper[4957]: I0123 11:02:42.550579 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-2-build_7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5/git-clone/0.log" Jan 23 11:02:42 crc kubenswrapper[4957]: I0123 11:02:42.551437 4957 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-2-build" Jan 23 11:02:42 crc kubenswrapper[4957]: I0123 11:02:42.649796 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5-build-blob-cache\") pod \"7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5\" (UID: \"7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5\") " Jan 23 11:02:42 crc kubenswrapper[4957]: I0123 11:02:42.649861 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5-container-storage-root\") pod \"7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5\" (UID: \"7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5\") " Jan 23 11:02:42 crc kubenswrapper[4957]: I0123 11:02:42.649932 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5-node-pullsecrets\") pod \"7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5\" (UID: \"7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5\") " Jan 23 11:02:42 crc kubenswrapper[4957]: I0123 11:02:42.649957 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5-build-ca-bundles\") pod \"7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5\" (UID: \"7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5\") " Jan 23 11:02:42 crc kubenswrapper[4957]: I0123 11:02:42.650048 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5" (UID: "7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5"). InnerVolumeSpecName "node-pullsecrets". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 23 11:02:42 crc kubenswrapper[4957]: I0123 11:02:42.650088 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5-buildworkdir\") pod \"7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5\" (UID: \"7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5\") " Jan 23 11:02:42 crc kubenswrapper[4957]: I0123 11:02:42.650573 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5" (UID: "7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 23 11:02:42 crc kubenswrapper[4957]: I0123 11:02:42.650593 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5" (UID: "7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 23 11:02:42 crc kubenswrapper[4957]: I0123 11:02:42.650564 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5" (UID: "7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 23 11:02:42 crc kubenswrapper[4957]: I0123 11:02:42.650671 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5" (UID: "7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 23 11:02:42 crc kubenswrapper[4957]: I0123 11:02:42.650698 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5-container-storage-run\") pod \"7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5\" (UID: \"7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5\") " Jan 23 11:02:42 crc kubenswrapper[4957]: I0123 11:02:42.650747 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-chpzb\" (UniqueName: \"kubernetes.io/projected/7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5-kube-api-access-chpzb\") pod \"7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5\" (UID: \"7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5\") " Jan 23 11:02:42 crc kubenswrapper[4957]: I0123 11:02:42.650930 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5" (UID: "7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5"). InnerVolumeSpecName "build-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 11:02:42 crc kubenswrapper[4957]: I0123 11:02:42.651355 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5-build-proxy-ca-bundles\") pod \"7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5\" (UID: \"7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5\") " Jan 23 11:02:42 crc kubenswrapper[4957]: I0123 11:02:42.651648 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5" (UID: "7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 11:02:42 crc kubenswrapper[4957]: I0123 11:02:42.651704 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-9pp79-push\" (UniqueName: \"kubernetes.io/secret/7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5-builder-dockercfg-9pp79-push\") pod \"7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5\" (UID: \"7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5\") " Jan 23 11:02:42 crc kubenswrapper[4957]: I0123 11:02:42.651727 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-9pp79-pull\" (UniqueName: \"kubernetes.io/secret/7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5-builder-dockercfg-9pp79-pull\") pod \"7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5\" (UID: \"7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5\") " Jan 23 11:02:42 crc kubenswrapper[4957]: I0123 11:02:42.652156 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5-buildcachedir\") pod \"7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5\" (UID: \"7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5\") " Jan 23 11:02:42 crc kubenswrapper[4957]: I0123 11:02:42.652211 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5-build-system-configs\") pod \"7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5\" (UID: \"7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5\") " Jan 23 11:02:42 crc kubenswrapper[4957]: I0123 11:02:42.652607 4957 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5-build-blob-cache\") on node \"crc\" DevicePath \"\"" Jan 23 11:02:42 crc kubenswrapper[4957]: I0123 11:02:42.652626 4957 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5-container-storage-root\") on node \"crc\" DevicePath \"\"" Jan 23 11:02:42 crc kubenswrapper[4957]: I0123 11:02:42.652637 4957 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Jan 23 11:02:42 crc kubenswrapper[4957]: I0123 11:02:42.652645 4957 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 23 11:02:42 crc kubenswrapper[4957]: I0123 11:02:42.652654 4957 
reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5-buildworkdir\") on node \"crc\" DevicePath \"\"" Jan 23 11:02:42 crc kubenswrapper[4957]: I0123 11:02:42.652664 4957 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5-container-storage-run\") on node \"crc\" DevicePath \"\"" Jan 23 11:02:42 crc kubenswrapper[4957]: I0123 11:02:42.652675 4957 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 23 11:02:42 crc kubenswrapper[4957]: I0123 11:02:42.652736 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5" (UID: "7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 23 11:02:42 crc kubenswrapper[4957]: I0123 11:02:42.652915 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5" (UID: "7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 11:02:42 crc kubenswrapper[4957]: I0123 11:02:42.658292 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5-builder-dockercfg-9pp79-pull" (OuterVolumeSpecName: "builder-dockercfg-9pp79-pull") pod "7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5" (UID: "7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5"). InnerVolumeSpecName "builder-dockercfg-9pp79-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 23 11:02:42 crc kubenswrapper[4957]: I0123 11:02:42.663464 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5-builder-dockercfg-9pp79-push" (OuterVolumeSpecName: "builder-dockercfg-9pp79-push") pod "7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5" (UID: "7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5"). InnerVolumeSpecName "builder-dockercfg-9pp79-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 23 11:02:42 crc kubenswrapper[4957]: I0123 11:02:42.664602 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5-kube-api-access-chpzb" (OuterVolumeSpecName: "kube-api-access-chpzb") pod "7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5" (UID: "7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5"). InnerVolumeSpecName "kube-api-access-chpzb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 23 11:02:42 crc kubenswrapper[4957]: I0123 11:02:42.753412 4957 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-chpzb\" (UniqueName: \"kubernetes.io/projected/7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5-kube-api-access-chpzb\") on node \"crc\" DevicePath \"\"" Jan 23 11:02:42 crc kubenswrapper[4957]: I0123 11:02:42.753454 4957 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-9pp79-push\" (UniqueName: \"kubernetes.io/secret/7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5-builder-dockercfg-9pp79-push\") on node \"crc\" DevicePath \"\"" Jan 23 11:02:42 crc kubenswrapper[4957]: I0123 11:02:42.753468 4957 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-9pp79-pull\" (UniqueName: \"kubernetes.io/secret/7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5-builder-dockercfg-9pp79-pull\") on node \"crc\" DevicePath \"\"" Jan 23 11:02:42 crc kubenswrapper[4957]: I0123 11:02:42.753479 4957 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5-buildcachedir\") on node \"crc\" DevicePath \"\"" Jan 23 11:02:42 crc kubenswrapper[4957]: I0123 11:02:42.753491 4957 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5-build-system-configs\") on node \"crc\" DevicePath \"\"" Jan 23 11:02:43 crc kubenswrapper[4957]: I0123 11:02:43.236655 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-2-build_7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5/git-clone/0.log" Jan 23 11:02:43 crc kubenswrapper[4957]: I0123 11:02:43.236795 4957 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-2-build" Jan 23 11:02:43 crc kubenswrapper[4957]: I0123 11:02:43.236799 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5","Type":"ContainerDied","Data":"2ccd61819f914a18a14592b6e949217d1451a0e49bafa4293dc94dc472c8ee54"} Jan 23 11:02:43 crc kubenswrapper[4957]: I0123 11:02:43.236846 4957 scope.go:117] "RemoveContainer" containerID="d95ca712d6bc1ee4b94968f644fd271510646032dd8f3f29ba88faa75f5e3fb2" Jan 23 11:02:43 crc kubenswrapper[4957]: I0123 11:02:43.239037 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-855d9ccff4-672j2" event={"ID":"c5bcc35d-cb6f-4f1f-9fa2-9b75b0ff008f","Type":"ContainerStarted","Data":"5dc9eb8dedd74a6967dbcb139fa84e1bc2e277aff3f200c63f2a356bcb385e94"} Jan 23 11:02:43 crc kubenswrapper[4957]: I0123 11:02:43.242878 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-f4fb5df64-ncfmh" event={"ID":"4f0c5571-a692-4f49-8f15-074769b8c297","Type":"ContainerStarted","Data":"a0fc720275999a87b14679c3c7462bcf2483199c589fdffd99fa99587ac85e6c"} Jan 23 11:02:43 crc kubenswrapper[4957]: I0123 11:02:43.243089 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-f4fb5df64-ncfmh" Jan 23 11:02:43 crc kubenswrapper[4957]: I0123 11:02:43.263651 4957 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-855d9ccff4-672j2" podStartSLOduration=3.478724766 podStartE2EDuration="10.263629774s" podCreationTimestamp="2026-01-23 11:02:33 +0000 UTC" firstStartedPulling="2026-01-23 11:02:35.597302839 +0000 UTC m=+665.134555526" lastFinishedPulling="2026-01-23 11:02:42.382207847 +0000 UTC m=+671.919460534" observedRunningTime="2026-01-23 11:02:43.260586807 +0000 UTC m=+672.797839534" watchObservedRunningTime="2026-01-23 11:02:43.263629774 +0000 UTC m=+672.800882461" Jan 23 11:02:43 crc kubenswrapper[4957]: I0123 11:02:43.329173 4957 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-f4fb5df64-ncfmh" podStartSLOduration=3.6875367949999998 podStartE2EDuration="10.329095075s" podCreationTimestamp="2026-01-23 11:02:33 +0000 UTC" firstStartedPulling="2026-01-23 11:02:35.758909536 +0000 UTC m=+665.296162223" lastFinishedPulling="2026-01-23 11:02:42.400467816 +0000 UTC m=+671.937720503" observedRunningTime="2026-01-23 11:02:43.323184128 +0000 UTC m=+672.860436825" watchObservedRunningTime="2026-01-23 11:02:43.329095075 +0000 UTC m=+672.866347782" Jan 23 11:02:43 crc kubenswrapper[4957]: I0123 11:02:43.350336 4957 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-operator-2-build"] Jan 23 11:02:43 crc kubenswrapper[4957]: I0123 11:02:43.355970 4957 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/service-telemetry-operator-2-build"] Jan 23 11:02:44 crc kubenswrapper[4957]: I0123 11:02:44.777726 4957 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5" path="/var/lib/kubelet/pods/7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5/volumes" Jan 23 11:02:45 crc kubenswrapper[4957]: I0123 11:02:45.259245 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" 
event={"ID":"8128176f-ed6a-4d47-84dd-11b2ada7a5ca","Type":"ContainerStarted","Data":"4bdc882cb6292860c05f8e060fb72a3090d593398e1836a86066ffd709275bda"} Jan 23 11:02:47 crc kubenswrapper[4957]: I0123 11:02:47.273228 4957 generic.go:334] "Generic (PLEG): container finished" podID="8128176f-ed6a-4d47-84dd-11b2ada7a5ca" containerID="4bdc882cb6292860c05f8e060fb72a3090d593398e1836a86066ffd709275bda" exitCode=0 Jan 23 11:02:47 crc kubenswrapper[4957]: I0123 11:02:47.273320 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"8128176f-ed6a-4d47-84dd-11b2ada7a5ca","Type":"ContainerDied","Data":"4bdc882cb6292860c05f8e060fb72a3090d593398e1836a86066ffd709275bda"} Jan 23 11:02:48 crc kubenswrapper[4957]: I0123 11:02:48.828441 4957 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-3-build"] Jan 23 11:02:48 crc kubenswrapper[4957]: E0123 11:02:48.828987 4957 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5" containerName="git-clone" Jan 23 11:02:48 crc kubenswrapper[4957]: I0123 11:02:48.829004 4957 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5" containerName="git-clone" Jan 23 11:02:48 crc kubenswrapper[4957]: E0123 11:02:48.829024 4957 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ff53e4f-d301-4c55-aa1b-4a80759dc4e3" containerName="manage-dockerfile" Jan 23 11:02:48 crc kubenswrapper[4957]: I0123 11:02:48.829032 4957 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ff53e4f-d301-4c55-aa1b-4a80759dc4e3" containerName="manage-dockerfile" Jan 23 11:02:48 crc kubenswrapper[4957]: I0123 11:02:48.829146 4957 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ff53e4f-d301-4c55-aa1b-4a80759dc4e3" containerName="manage-dockerfile" Jan 23 11:02:48 crc kubenswrapper[4957]: I0123 11:02:48.829165 4957 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a6a7ba7-e3b4-4251-91e1-3ca02ae7e0d5" containerName="git-clone" Jan 23 11:02:48 crc kubenswrapper[4957]: I0123 11:02:48.829889 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-f4fb5df64-ncfmh" Jan 23 11:02:48 crc kubenswrapper[4957]: I0123 11:02:48.830083 4957 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-3-build" Jan 23 11:02:48 crc kubenswrapper[4957]: I0123 11:02:48.832159 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-3-sys-config" Jan 23 11:02:48 crc kubenswrapper[4957]: I0123 11:02:48.832766 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-3-ca" Jan 23 11:02:48 crc kubenswrapper[4957]: I0123 11:02:48.833333 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-3-global-ca" Jan 23 11:02:48 crc kubenswrapper[4957]: I0123 11:02:48.833528 4957 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-9pp79" Jan 23 11:02:48 crc kubenswrapper[4957]: I0123 11:02:48.847547 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-3-build"] Jan 23 11:02:48 crc kubenswrapper[4957]: I0123 11:02:48.935393 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35-container-storage-run\") pod \"service-telemetry-operator-3-build\" (UID: \"c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35\") " pod="service-telemetry/service-telemetry-operator-3-build" Jan 23 11:02:48 crc kubenswrapper[4957]: I0123 11:02:48.935436 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35-buildcachedir\") pod \"service-telemetry-operator-3-build\" (UID: \"c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35\") " pod="service-telemetry/service-telemetry-operator-3-build" Jan 23 11:02:48 crc kubenswrapper[4957]: I0123 11:02:48.935501 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35-node-pullsecrets\") pod \"service-telemetry-operator-3-build\" (UID: \"c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35\") " pod="service-telemetry/service-telemetry-operator-3-build" Jan 23 11:02:48 crc kubenswrapper[4957]: I0123 11:02:48.935531 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35-buildworkdir\") pod \"service-telemetry-operator-3-build\" (UID: \"c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35\") " pod="service-telemetry/service-telemetry-operator-3-build" Jan 23 11:02:48 crc kubenswrapper[4957]: I0123 11:02:48.935561 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-9pp79-pull\" (UniqueName: \"kubernetes.io/secret/c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35-builder-dockercfg-9pp79-pull\") pod \"service-telemetry-operator-3-build\" (UID: \"c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35\") " pod="service-telemetry/service-telemetry-operator-3-build" Jan 23 11:02:48 crc kubenswrapper[4957]: I0123 11:02:48.935581 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35-build-blob-cache\") pod \"service-telemetry-operator-3-build\" (UID: 
\"c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35\") " pod="service-telemetry/service-telemetry-operator-3-build" Jan 23 11:02:48 crc kubenswrapper[4957]: I0123 11:02:48.935601 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35-build-proxy-ca-bundles\") pod \"service-telemetry-operator-3-build\" (UID: \"c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35\") " pod="service-telemetry/service-telemetry-operator-3-build" Jan 23 11:02:48 crc kubenswrapper[4957]: I0123 11:02:48.935768 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35-build-system-configs\") pod \"service-telemetry-operator-3-build\" (UID: \"c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35\") " pod="service-telemetry/service-telemetry-operator-3-build" Jan 23 11:02:48 crc kubenswrapper[4957]: I0123 11:02:48.935819 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pcnzl\" (UniqueName: \"kubernetes.io/projected/c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35-kube-api-access-pcnzl\") pod \"service-telemetry-operator-3-build\" (UID: \"c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35\") " pod="service-telemetry/service-telemetry-operator-3-build" Jan 23 11:02:48 crc kubenswrapper[4957]: I0123 11:02:48.935843 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35-container-storage-root\") pod \"service-telemetry-operator-3-build\" (UID: \"c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35\") " pod="service-telemetry/service-telemetry-operator-3-build" Jan 23 11:02:48 crc kubenswrapper[4957]: I0123 11:02:48.935980 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35-build-ca-bundles\") pod \"service-telemetry-operator-3-build\" (UID: \"c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35\") " pod="service-telemetry/service-telemetry-operator-3-build" Jan 23 11:02:48 crc kubenswrapper[4957]: I0123 11:02:48.936021 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-9pp79-push\" (UniqueName: \"kubernetes.io/secret/c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35-builder-dockercfg-9pp79-push\") pod \"service-telemetry-operator-3-build\" (UID: \"c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35\") " pod="service-telemetry/service-telemetry-operator-3-build" Jan 23 11:02:49 crc kubenswrapper[4957]: I0123 11:02:49.036803 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35-node-pullsecrets\") pod \"service-telemetry-operator-3-build\" (UID: \"c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35\") " pod="service-telemetry/service-telemetry-operator-3-build" Jan 23 11:02:49 crc kubenswrapper[4957]: I0123 11:02:49.036847 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35-buildworkdir\") pod \"service-telemetry-operator-3-build\" (UID: \"c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35\") " 
pod="service-telemetry/service-telemetry-operator-3-build" Jan 23 11:02:49 crc kubenswrapper[4957]: I0123 11:02:49.036864 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-9pp79-pull\" (UniqueName: \"kubernetes.io/secret/c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35-builder-dockercfg-9pp79-pull\") pod \"service-telemetry-operator-3-build\" (UID: \"c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35\") " pod="service-telemetry/service-telemetry-operator-3-build" Jan 23 11:02:49 crc kubenswrapper[4957]: I0123 11:02:49.036886 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35-build-blob-cache\") pod \"service-telemetry-operator-3-build\" (UID: \"c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35\") " pod="service-telemetry/service-telemetry-operator-3-build" Jan 23 11:02:49 crc kubenswrapper[4957]: I0123 11:02:49.036903 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35-build-proxy-ca-bundles\") pod \"service-telemetry-operator-3-build\" (UID: \"c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35\") " pod="service-telemetry/service-telemetry-operator-3-build" Jan 23 11:02:49 crc kubenswrapper[4957]: I0123 11:02:49.036927 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35-build-system-configs\") pod \"service-telemetry-operator-3-build\" (UID: \"c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35\") " pod="service-telemetry/service-telemetry-operator-3-build" Jan 23 11:02:49 crc kubenswrapper[4957]: I0123 11:02:49.036946 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pcnzl\" (UniqueName: \"kubernetes.io/projected/c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35-kube-api-access-pcnzl\") pod \"service-telemetry-operator-3-build\" (UID: \"c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35\") " pod="service-telemetry/service-telemetry-operator-3-build" Jan 23 11:02:49 crc kubenswrapper[4957]: I0123 11:02:49.036967 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35-container-storage-root\") pod \"service-telemetry-operator-3-build\" (UID: \"c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35\") " pod="service-telemetry/service-telemetry-operator-3-build" Jan 23 11:02:49 crc kubenswrapper[4957]: I0123 11:02:49.036997 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35-build-ca-bundles\") pod \"service-telemetry-operator-3-build\" (UID: \"c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35\") " pod="service-telemetry/service-telemetry-operator-3-build" Jan 23 11:02:49 crc kubenswrapper[4957]: I0123 11:02:49.037014 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-9pp79-push\" (UniqueName: \"kubernetes.io/secret/c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35-builder-dockercfg-9pp79-push\") pod \"service-telemetry-operator-3-build\" (UID: \"c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35\") " pod="service-telemetry/service-telemetry-operator-3-build" Jan 23 11:02:49 crc kubenswrapper[4957]: I0123 11:02:49.037047 4957 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35-container-storage-run\") pod \"service-telemetry-operator-3-build\" (UID: \"c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35\") " pod="service-telemetry/service-telemetry-operator-3-build" Jan 23 11:02:49 crc kubenswrapper[4957]: I0123 11:02:49.037064 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35-buildcachedir\") pod \"service-telemetry-operator-3-build\" (UID: \"c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35\") " pod="service-telemetry/service-telemetry-operator-3-build" Jan 23 11:02:49 crc kubenswrapper[4957]: I0123 11:02:49.037125 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35-buildcachedir\") pod \"service-telemetry-operator-3-build\" (UID: \"c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35\") " pod="service-telemetry/service-telemetry-operator-3-build" Jan 23 11:02:49 crc kubenswrapper[4957]: I0123 11:02:49.037457 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35-container-storage-root\") pod \"service-telemetry-operator-3-build\" (UID: \"c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35\") " pod="service-telemetry/service-telemetry-operator-3-build" Jan 23 11:02:49 crc kubenswrapper[4957]: I0123 11:02:49.037665 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35-buildworkdir\") pod \"service-telemetry-operator-3-build\" (UID: \"c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35\") " pod="service-telemetry/service-telemetry-operator-3-build" Jan 23 11:02:49 crc kubenswrapper[4957]: I0123 11:02:49.037829 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35-node-pullsecrets\") pod \"service-telemetry-operator-3-build\" (UID: \"c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35\") " pod="service-telemetry/service-telemetry-operator-3-build" Jan 23 11:02:49 crc kubenswrapper[4957]: I0123 11:02:49.038192 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35-build-ca-bundles\") pod \"service-telemetry-operator-3-build\" (UID: \"c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35\") " pod="service-telemetry/service-telemetry-operator-3-build" Jan 23 11:02:49 crc kubenswrapper[4957]: I0123 11:02:49.038380 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35-container-storage-run\") pod \"service-telemetry-operator-3-build\" (UID: \"c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35\") " pod="service-telemetry/service-telemetry-operator-3-build" Jan 23 11:02:49 crc kubenswrapper[4957]: I0123 11:02:49.038763 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35-build-proxy-ca-bundles\") pod \"service-telemetry-operator-3-build\" (UID: \"c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35\") " 
pod="service-telemetry/service-telemetry-operator-3-build" Jan 23 11:02:49 crc kubenswrapper[4957]: I0123 11:02:49.039183 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35-build-blob-cache\") pod \"service-telemetry-operator-3-build\" (UID: \"c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35\") " pod="service-telemetry/service-telemetry-operator-3-build" Jan 23 11:02:49 crc kubenswrapper[4957]: I0123 11:02:49.039257 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35-build-system-configs\") pod \"service-telemetry-operator-3-build\" (UID: \"c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35\") " pod="service-telemetry/service-telemetry-operator-3-build" Jan 23 11:02:49 crc kubenswrapper[4957]: I0123 11:02:49.042575 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-9pp79-pull\" (UniqueName: \"kubernetes.io/secret/c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35-builder-dockercfg-9pp79-pull\") pod \"service-telemetry-operator-3-build\" (UID: \"c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35\") " pod="service-telemetry/service-telemetry-operator-3-build" Jan 23 11:02:49 crc kubenswrapper[4957]: I0123 11:02:49.043065 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-9pp79-push\" (UniqueName: \"kubernetes.io/secret/c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35-builder-dockercfg-9pp79-push\") pod \"service-telemetry-operator-3-build\" (UID: \"c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35\") " pod="service-telemetry/service-telemetry-operator-3-build" Jan 23 11:02:49 crc kubenswrapper[4957]: I0123 11:02:49.057224 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pcnzl\" (UniqueName: \"kubernetes.io/projected/c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35-kube-api-access-pcnzl\") pod \"service-telemetry-operator-3-build\" (UID: \"c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35\") " pod="service-telemetry/service-telemetry-operator-3-build" Jan 23 11:02:49 crc kubenswrapper[4957]: I0123 11:02:49.147193 4957 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-3-build" Jan 23 11:02:49 crc kubenswrapper[4957]: I0123 11:02:49.390993 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-3-build"] Jan 23 11:02:49 crc kubenswrapper[4957]: W0123 11:02:49.400449 4957 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc208ad5f_d09b_4eb4_b0a0_bb8ab9eabc35.slice/crio-9930aec69cf54283a291b0621c5922fd4be35d73c9a2be234a2f8b14752e2520 WatchSource:0}: Error finding container 9930aec69cf54283a291b0621c5922fd4be35d73c9a2be234a2f8b14752e2520: Status 404 returned error can't find the container with id 9930aec69cf54283a291b0621c5922fd4be35d73c9a2be234a2f8b14752e2520 Jan 23 11:02:50 crc kubenswrapper[4957]: I0123 11:02:50.301509 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-3-build" event={"ID":"c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35","Type":"ContainerStarted","Data":"9930aec69cf54283a291b0621c5922fd4be35d73c9a2be234a2f8b14752e2520"} Jan 23 11:02:52 crc kubenswrapper[4957]: I0123 11:02:52.402371 4957 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-86cb77c54b-x5z94"] Jan 23 11:02:52 crc kubenswrapper[4957]: I0123 11:02:52.407516 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-86cb77c54b-x5z94" Jan 23 11:02:52 crc kubenswrapper[4957]: I0123 11:02:52.411015 4957 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-dc9h5" Jan 23 11:02:52 crc kubenswrapper[4957]: I0123 11:02:52.421814 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-86cb77c54b-x5z94"] Jan 23 11:02:52 crc kubenswrapper[4957]: I0123 11:02:52.491676 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/308add36-df2e-4461-a8ee-3a3a430b738a-bound-sa-token\") pod \"cert-manager-86cb77c54b-x5z94\" (UID: \"308add36-df2e-4461-a8ee-3a3a430b738a\") " pod="cert-manager/cert-manager-86cb77c54b-x5z94" Jan 23 11:02:52 crc kubenswrapper[4957]: I0123 11:02:52.491744 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfvmv\" (UniqueName: \"kubernetes.io/projected/308add36-df2e-4461-a8ee-3a3a430b738a-kube-api-access-bfvmv\") pod \"cert-manager-86cb77c54b-x5z94\" (UID: \"308add36-df2e-4461-a8ee-3a3a430b738a\") " pod="cert-manager/cert-manager-86cb77c54b-x5z94" Jan 23 11:02:52 crc kubenswrapper[4957]: I0123 11:02:52.592904 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfvmv\" (UniqueName: \"kubernetes.io/projected/308add36-df2e-4461-a8ee-3a3a430b738a-kube-api-access-bfvmv\") pod \"cert-manager-86cb77c54b-x5z94\" (UID: \"308add36-df2e-4461-a8ee-3a3a430b738a\") " pod="cert-manager/cert-manager-86cb77c54b-x5z94" Jan 23 11:02:52 crc kubenswrapper[4957]: I0123 11:02:52.593194 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/308add36-df2e-4461-a8ee-3a3a430b738a-bound-sa-token\") pod \"cert-manager-86cb77c54b-x5z94\" (UID: \"308add36-df2e-4461-a8ee-3a3a430b738a\") " pod="cert-manager/cert-manager-86cb77c54b-x5z94" Jan 23 11:02:52 crc kubenswrapper[4957]: I0123 11:02:52.614442 4957 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/308add36-df2e-4461-a8ee-3a3a430b738a-bound-sa-token\") pod \"cert-manager-86cb77c54b-x5z94\" (UID: \"308add36-df2e-4461-a8ee-3a3a430b738a\") " pod="cert-manager/cert-manager-86cb77c54b-x5z94" Jan 23 11:02:52 crc kubenswrapper[4957]: I0123 11:02:52.615525 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfvmv\" (UniqueName: \"kubernetes.io/projected/308add36-df2e-4461-a8ee-3a3a430b738a-kube-api-access-bfvmv\") pod \"cert-manager-86cb77c54b-x5z94\" (UID: \"308add36-df2e-4461-a8ee-3a3a430b738a\") " pod="cert-manager/cert-manager-86cb77c54b-x5z94" Jan 23 11:02:52 crc kubenswrapper[4957]: I0123 11:02:52.737810 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-86cb77c54b-x5z94" Jan 23 11:02:53 crc kubenswrapper[4957]: I0123 11:02:53.156697 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-86cb77c54b-x5z94"] Jan 23 11:02:53 crc kubenswrapper[4957]: W0123 11:02:53.165984 4957 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod308add36_df2e_4461_a8ee_3a3a430b738a.slice/crio-94533f2035991ccf6c619394c9f378dd44c507e3693c946a4321aa4ff483158f WatchSource:0}: Error finding container 94533f2035991ccf6c619394c9f378dd44c507e3693c946a4321aa4ff483158f: Status 404 returned error can't find the container with id 94533f2035991ccf6c619394c9f378dd44c507e3693c946a4321aa4ff483158f Jan 23 11:02:53 crc kubenswrapper[4957]: I0123 11:02:53.322267 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-86cb77c54b-x5z94" event={"ID":"308add36-df2e-4461-a8ee-3a3a430b738a","Type":"ContainerStarted","Data":"94533f2035991ccf6c619394c9f378dd44c507e3693c946a4321aa4ff483158f"} Jan 23 11:02:54 crc kubenswrapper[4957]: I0123 11:02:54.905605 4957 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-5vdlt container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.23:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 23 11:02:54 crc kubenswrapper[4957]: I0123 11:02:54.905696 4957 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-5vdlt" podUID="a4d8bbc7-e221-4057-ae8c-c63d33b2e4f5" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.23:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 23 11:02:57 crc kubenswrapper[4957]: I0123 11:02:57.349431 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-3-build" event={"ID":"c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35","Type":"ContainerStarted","Data":"f9a23561047ae63a2ddffdd9a5f51063229ef87859eddc4274dc6729ed130f1a"} Jan 23 11:02:57 crc kubenswrapper[4957]: I0123 11:02:57.351213 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-86cb77c54b-x5z94" event={"ID":"308add36-df2e-4461-a8ee-3a3a430b738a","Type":"ContainerStarted","Data":"6646df431b782515b591e2550855335e2c2451c72bc4742eb0d73b93e206cee3"} Jan 23 11:02:57 crc kubenswrapper[4957]: I0123 11:02:57.353088 4957 generic.go:334] "Generic (PLEG): container finished" 
podID="8128176f-ed6a-4d47-84dd-11b2ada7a5ca" containerID="1aac66b3f29d868d4f9de450af66c779c98d2b35378e0318c3fe1c98edc3978e" exitCode=0 Jan 23 11:02:57 crc kubenswrapper[4957]: I0123 11:02:57.353124 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"8128176f-ed6a-4d47-84dd-11b2ada7a5ca","Type":"ContainerDied","Data":"1aac66b3f29d868d4f9de450af66c779c98d2b35378e0318c3fe1c98edc3978e"} Jan 23 11:02:57 crc kubenswrapper[4957]: E0123 11:02:57.415954 4957 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=2867283724404936083, SKID=, AKID=D1:AA:0A:78:C0:D9:52:38:C6:A3:64:4D:E2:34:7C:D4:DF:E9:F7:AF failed: x509: certificate signed by unknown authority" Jan 23 11:02:57 crc kubenswrapper[4957]: I0123 11:02:57.450031 4957 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-86cb77c54b-x5z94" podStartSLOduration=5.450012734 podStartE2EDuration="5.450012734s" podCreationTimestamp="2026-01-23 11:02:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-23 11:02:57.447466281 +0000 UTC m=+686.984719008" watchObservedRunningTime="2026-01-23 11:02:57.450012734 +0000 UTC m=+686.987265421" Jan 23 11:02:58 crc kubenswrapper[4957]: I0123 11:02:58.365311 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"8128176f-ed6a-4d47-84dd-11b2ada7a5ca","Type":"ContainerStarted","Data":"3072ef1bd1c75d4467b28b2fbfdec2c929d6eaa89b53cbc9aa2c2fc876f529c9"} Jan 23 11:02:58 crc kubenswrapper[4957]: I0123 11:02:58.365940 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="service-telemetry/elasticsearch-es-default-0" Jan 23 11:02:58 crc kubenswrapper[4957]: I0123 11:02:58.410684 4957 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/elasticsearch-es-default-0" podStartSLOduration=15.362310872 podStartE2EDuration="56.410660084s" podCreationTimestamp="2026-01-23 11:02:02 +0000 UTC" firstStartedPulling="2026-01-23 11:02:03.221737575 +0000 UTC m=+632.758990262" lastFinishedPulling="2026-01-23 11:02:44.270086747 +0000 UTC m=+673.807339474" observedRunningTime="2026-01-23 11:02:58.408590425 +0000 UTC m=+687.945843122" watchObservedRunningTime="2026-01-23 11:02:58.410660084 +0000 UTC m=+687.947912781" Jan 23 11:02:58 crc kubenswrapper[4957]: I0123 11:02:58.446789 4957 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-operator-3-build"] Jan 23 11:02:59 crc kubenswrapper[4957]: I0123 11:02:59.371433 4957 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/service-telemetry-operator-3-build" podUID="c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35" containerName="git-clone" containerID="cri-o://f9a23561047ae63a2ddffdd9a5f51063229ef87859eddc4274dc6729ed130f1a" gracePeriod=30 Jan 23 11:03:00 crc kubenswrapper[4957]: I0123 11:03:00.127412 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-3-build_c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35/git-clone/0.log" Jan 23 11:03:00 crc kubenswrapper[4957]: I0123 11:03:00.127731 4957 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-3-build" Jan 23 11:03:00 crc kubenswrapper[4957]: I0123 11:03:00.219935 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35-build-ca-bundles\") pod \"c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35\" (UID: \"c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35\") " Jan 23 11:03:00 crc kubenswrapper[4957]: I0123 11:03:00.220034 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35-container-storage-root\") pod \"c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35\" (UID: \"c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35\") " Jan 23 11:03:00 crc kubenswrapper[4957]: I0123 11:03:00.220077 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35-container-storage-run\") pod \"c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35\" (UID: \"c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35\") " Jan 23 11:03:00 crc kubenswrapper[4957]: I0123 11:03:00.220100 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcnzl\" (UniqueName: \"kubernetes.io/projected/c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35-kube-api-access-pcnzl\") pod \"c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35\" (UID: \"c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35\") " Jan 23 11:03:00 crc kubenswrapper[4957]: I0123 11:03:00.220145 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-9pp79-pull\" (UniqueName: \"kubernetes.io/secret/c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35-builder-dockercfg-9pp79-pull\") pod \"c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35\" (UID: \"c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35\") " Jan 23 11:03:00 crc kubenswrapper[4957]: I0123 11:03:00.220170 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-9pp79-push\" (UniqueName: \"kubernetes.io/secret/c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35-builder-dockercfg-9pp79-push\") pod \"c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35\" (UID: \"c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35\") " Jan 23 11:03:00 crc kubenswrapper[4957]: I0123 11:03:00.220198 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35-build-proxy-ca-bundles\") pod \"c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35\" (UID: \"c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35\") " Jan 23 11:03:00 crc kubenswrapper[4957]: I0123 11:03:00.220224 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35-build-system-configs\") pod \"c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35\" (UID: \"c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35\") " Jan 23 11:03:00 crc kubenswrapper[4957]: I0123 11:03:00.220271 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35-buildcachedir\") pod \"c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35\" (UID: \"c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35\") " Jan 23 11:03:00 crc kubenswrapper[4957]: I0123 11:03:00.220335 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35-build-blob-cache\") pod \"c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35\" (UID: \"c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35\") " Jan 23 11:03:00 crc kubenswrapper[4957]: I0123 11:03:00.220377 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35-buildworkdir\") pod \"c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35\" (UID: \"c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35\") " Jan 23 11:03:00 crc kubenswrapper[4957]: I0123 11:03:00.220385 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35" (UID: "c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 23 11:03:00 crc kubenswrapper[4957]: I0123 11:03:00.220413 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35-node-pullsecrets\") pod \"c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35\" (UID: \"c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35\") " Jan 23 11:03:00 crc kubenswrapper[4957]: I0123 11:03:00.220439 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35" (UID: "c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 23 11:03:00 crc kubenswrapper[4957]: I0123 11:03:00.220647 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35" (UID: "c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 23 11:03:00 crc kubenswrapper[4957]: I0123 11:03:00.220701 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35" (UID: "c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 23 11:03:00 crc kubenswrapper[4957]: I0123 11:03:00.220837 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35" (UID: "c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 23 11:03:00 crc kubenswrapper[4957]: I0123 11:03:00.220884 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35" (UID: "c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35"). 
InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 11:03:00 crc kubenswrapper[4957]: I0123 11:03:00.220959 4957 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35-container-storage-run\") on node \"crc\" DevicePath \"\"" Jan 23 11:03:00 crc kubenswrapper[4957]: I0123 11:03:00.220977 4957 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35-buildcachedir\") on node \"crc\" DevicePath \"\"" Jan 23 11:03:00 crc kubenswrapper[4957]: I0123 11:03:00.220986 4957 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35-build-blob-cache\") on node \"crc\" DevicePath \"\"" Jan 23 11:03:00 crc kubenswrapper[4957]: I0123 11:03:00.221005 4957 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35-buildworkdir\") on node \"crc\" DevicePath \"\"" Jan 23 11:03:00 crc kubenswrapper[4957]: I0123 11:03:00.221016 4957 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Jan 23 11:03:00 crc kubenswrapper[4957]: I0123 11:03:00.221024 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35" (UID: "c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 11:03:00 crc kubenswrapper[4957]: I0123 11:03:00.221042 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35" (UID: "c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 11:03:00 crc kubenswrapper[4957]: I0123 11:03:00.221126 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35" (UID: "c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 23 11:03:00 crc kubenswrapper[4957]: I0123 11:03:00.225701 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35-builder-dockercfg-9pp79-pull" (OuterVolumeSpecName: "builder-dockercfg-9pp79-pull") pod "c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35" (UID: "c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35"). InnerVolumeSpecName "builder-dockercfg-9pp79-pull". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 23 11:03:00 crc kubenswrapper[4957]: I0123 11:03:00.226336 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35-kube-api-access-pcnzl" (OuterVolumeSpecName: "kube-api-access-pcnzl") pod "c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35" (UID: "c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35"). InnerVolumeSpecName "kube-api-access-pcnzl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 23 11:03:00 crc kubenswrapper[4957]: I0123 11:03:00.240963 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35-builder-dockercfg-9pp79-push" (OuterVolumeSpecName: "builder-dockercfg-9pp79-push") pod "c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35" (UID: "c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35"). InnerVolumeSpecName "builder-dockercfg-9pp79-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 23 11:03:00 crc kubenswrapper[4957]: I0123 11:03:00.322126 4957 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 23 11:03:00 crc kubenswrapper[4957]: I0123 11:03:00.322172 4957 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35-container-storage-root\") on node \"crc\" DevicePath \"\"" Jan 23 11:03:00 crc kubenswrapper[4957]: I0123 11:03:00.322191 4957 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcnzl\" (UniqueName: \"kubernetes.io/projected/c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35-kube-api-access-pcnzl\") on node \"crc\" DevicePath \"\"" Jan 23 11:03:00 crc kubenswrapper[4957]: I0123 11:03:00.322219 4957 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-9pp79-pull\" (UniqueName: \"kubernetes.io/secret/c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35-builder-dockercfg-9pp79-pull\") on node \"crc\" DevicePath \"\"" Jan 23 11:03:00 crc kubenswrapper[4957]: I0123 11:03:00.322238 4957 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-9pp79-push\" (UniqueName: \"kubernetes.io/secret/c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35-builder-dockercfg-9pp79-push\") on node \"crc\" DevicePath \"\"" Jan 23 11:03:00 crc kubenswrapper[4957]: I0123 11:03:00.322254 4957 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 23 11:03:00 crc kubenswrapper[4957]: I0123 11:03:00.322269 4957 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35-build-system-configs\") on node \"crc\" DevicePath \"\"" Jan 23 11:03:00 crc kubenswrapper[4957]: I0123 11:03:00.377532 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-3-build_c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35/git-clone/0.log" Jan 23 11:03:00 crc kubenswrapper[4957]: I0123 11:03:00.377590 4957 generic.go:334] "Generic (PLEG): container finished" podID="c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35" containerID="f9a23561047ae63a2ddffdd9a5f51063229ef87859eddc4274dc6729ed130f1a" exitCode=1 Jan 23 11:03:00 crc kubenswrapper[4957]: I0123 11:03:00.377616 4957 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-3-build" event={"ID":"c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35","Type":"ContainerDied","Data":"f9a23561047ae63a2ddffdd9a5f51063229ef87859eddc4274dc6729ed130f1a"} Jan 23 11:03:00 crc kubenswrapper[4957]: I0123 11:03:00.377639 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-3-build" event={"ID":"c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35","Type":"ContainerDied","Data":"9930aec69cf54283a291b0621c5922fd4be35d73c9a2be234a2f8b14752e2520"} Jan 23 11:03:00 crc kubenswrapper[4957]: I0123 11:03:00.377654 4957 scope.go:117] "RemoveContainer" containerID="f9a23561047ae63a2ddffdd9a5f51063229ef87859eddc4274dc6729ed130f1a" Jan 23 11:03:00 crc kubenswrapper[4957]: I0123 11:03:00.377717 4957 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-3-build" Jan 23 11:03:00 crc kubenswrapper[4957]: I0123 11:03:00.405409 4957 scope.go:117] "RemoveContainer" containerID="f9a23561047ae63a2ddffdd9a5f51063229ef87859eddc4274dc6729ed130f1a" Jan 23 11:03:00 crc kubenswrapper[4957]: E0123 11:03:00.407172 4957 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9a23561047ae63a2ddffdd9a5f51063229ef87859eddc4274dc6729ed130f1a\": container with ID starting with f9a23561047ae63a2ddffdd9a5f51063229ef87859eddc4274dc6729ed130f1a not found: ID does not exist" containerID="f9a23561047ae63a2ddffdd9a5f51063229ef87859eddc4274dc6729ed130f1a" Jan 23 11:03:00 crc kubenswrapper[4957]: I0123 11:03:00.407239 4957 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9a23561047ae63a2ddffdd9a5f51063229ef87859eddc4274dc6729ed130f1a"} err="failed to get container status \"f9a23561047ae63a2ddffdd9a5f51063229ef87859eddc4274dc6729ed130f1a\": rpc error: code = NotFound desc = could not find container \"f9a23561047ae63a2ddffdd9a5f51063229ef87859eddc4274dc6729ed130f1a\": container with ID starting with f9a23561047ae63a2ddffdd9a5f51063229ef87859eddc4274dc6729ed130f1a not found: ID does not exist" Jan 23 11:03:00 crc kubenswrapper[4957]: I0123 11:03:00.422887 4957 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-operator-3-build"] Jan 23 11:03:00 crc kubenswrapper[4957]: I0123 11:03:00.433124 4957 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/service-telemetry-operator-3-build"] Jan 23 11:03:00 crc kubenswrapper[4957]: I0123 11:03:00.782883 4957 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35" path="/var/lib/kubelet/pods/c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35/volumes" Jan 23 11:03:07 crc kubenswrapper[4957]: I0123 11:03:07.889651 4957 prober.go:107] "Probe failed" probeType="Readiness" pod="service-telemetry/elasticsearch-es-default-0" podUID="8128176f-ed6a-4d47-84dd-11b2ada7a5ca" containerName="elasticsearch" probeResult="failure" output=< Jan 23 11:03:07 crc kubenswrapper[4957]: {"timestamp": "2026-01-23T11:03:07+00:00", "message": "readiness probe failed", "curl_rc": "7"} Jan 23 11:03:07 crc kubenswrapper[4957]: > Jan 23 11:03:09 crc kubenswrapper[4957]: I0123 11:03:09.859734 4957 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-4-build"] Jan 23 11:03:09 crc kubenswrapper[4957]: E0123 11:03:09.859989 4957 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35" containerName="git-clone" Jan 23 11:03:09 crc kubenswrapper[4957]: I0123 11:03:09.860006 4957 state_mem.go:107] "Deleted CPUSet assignment" podUID="c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35" containerName="git-clone" Jan 23 11:03:09 crc kubenswrapper[4957]: I0123 11:03:09.860123 4957 memory_manager.go:354] "RemoveStaleState removing state" podUID="c208ad5f-d09b-4eb4-b0a0-bb8ab9eabc35" containerName="git-clone" Jan 23 11:03:09 crc kubenswrapper[4957]: I0123 11:03:09.860982 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-4-build" Jan 23 11:03:09 crc kubenswrapper[4957]: I0123 11:03:09.863782 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-4-ca" Jan 23 11:03:09 crc kubenswrapper[4957]: I0123 11:03:09.864393 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-4-sys-config" Jan 23 11:03:09 crc kubenswrapper[4957]: I0123 11:03:09.865426 4957 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-9pp79" Jan 23 11:03:09 crc kubenswrapper[4957]: I0123 11:03:09.868928 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-4-global-ca" Jan 23 11:03:09 crc kubenswrapper[4957]: I0123 11:03:09.881543 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-4-build"] Jan 23 11:03:09 crc kubenswrapper[4957]: I0123 11:03:09.953300 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/dbc410fb-3db7-430b-8bac-0f5c7398aed2-container-storage-root\") pod \"service-telemetry-operator-4-build\" (UID: \"dbc410fb-3db7-430b-8bac-0f5c7398aed2\") " pod="service-telemetry/service-telemetry-operator-4-build" Jan 23 11:03:09 crc kubenswrapper[4957]: I0123 11:03:09.953343 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/dbc410fb-3db7-430b-8bac-0f5c7398aed2-container-storage-run\") pod \"service-telemetry-operator-4-build\" (UID: \"dbc410fb-3db7-430b-8bac-0f5c7398aed2\") " pod="service-telemetry/service-telemetry-operator-4-build" Jan 23 11:03:09 crc kubenswrapper[4957]: I0123 11:03:09.953384 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-9pp79-pull\" (UniqueName: \"kubernetes.io/secret/dbc410fb-3db7-430b-8bac-0f5c7398aed2-builder-dockercfg-9pp79-pull\") pod \"service-telemetry-operator-4-build\" (UID: \"dbc410fb-3db7-430b-8bac-0f5c7398aed2\") " pod="service-telemetry/service-telemetry-operator-4-build" Jan 23 11:03:09 crc kubenswrapper[4957]: I0123 11:03:09.953645 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-9pp79-push\" (UniqueName: \"kubernetes.io/secret/dbc410fb-3db7-430b-8bac-0f5c7398aed2-builder-dockercfg-9pp79-push\") pod \"service-telemetry-operator-4-build\" (UID: \"dbc410fb-3db7-430b-8bac-0f5c7398aed2\") " pod="service-telemetry/service-telemetry-operator-4-build" Jan 23 11:03:09 crc kubenswrapper[4957]: I0123 11:03:09.953713 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/dbc410fb-3db7-430b-8bac-0f5c7398aed2-buildworkdir\") pod \"service-telemetry-operator-4-build\" (UID: \"dbc410fb-3db7-430b-8bac-0f5c7398aed2\") " pod="service-telemetry/service-telemetry-operator-4-build" Jan 23 11:03:09 crc kubenswrapper[4957]: I0123 11:03:09.953798 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/dbc410fb-3db7-430b-8bac-0f5c7398aed2-node-pullsecrets\") pod \"service-telemetry-operator-4-build\" (UID: \"dbc410fb-3db7-430b-8bac-0f5c7398aed2\") " pod="service-telemetry/service-telemetry-operator-4-build" Jan 23 11:03:09 crc kubenswrapper[4957]: I0123 11:03:09.953875 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/dbc410fb-3db7-430b-8bac-0f5c7398aed2-build-proxy-ca-bundles\") pod \"service-telemetry-operator-4-build\" (UID: \"dbc410fb-3db7-430b-8bac-0f5c7398aed2\") " pod="service-telemetry/service-telemetry-operator-4-build" Jan 23 11:03:09 crc kubenswrapper[4957]: I0123 11:03:09.953916 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/dbc410fb-3db7-430b-8bac-0f5c7398aed2-build-ca-bundles\") pod \"service-telemetry-operator-4-build\" (UID: \"dbc410fb-3db7-430b-8bac-0f5c7398aed2\") " pod="service-telemetry/service-telemetry-operator-4-build" Jan 23 11:03:09 crc kubenswrapper[4957]: I0123 11:03:09.953941 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/dbc410fb-3db7-430b-8bac-0f5c7398aed2-buildcachedir\") pod \"service-telemetry-operator-4-build\" (UID: \"dbc410fb-3db7-430b-8bac-0f5c7398aed2\") " pod="service-telemetry/service-telemetry-operator-4-build" Jan 23 11:03:09 crc kubenswrapper[4957]: I0123 11:03:09.953967 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-745qf\" (UniqueName: \"kubernetes.io/projected/dbc410fb-3db7-430b-8bac-0f5c7398aed2-kube-api-access-745qf\") pod \"service-telemetry-operator-4-build\" (UID: \"dbc410fb-3db7-430b-8bac-0f5c7398aed2\") " pod="service-telemetry/service-telemetry-operator-4-build" Jan 23 11:03:09 crc kubenswrapper[4957]: I0123 11:03:09.954013 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/dbc410fb-3db7-430b-8bac-0f5c7398aed2-build-system-configs\") pod \"service-telemetry-operator-4-build\" (UID: \"dbc410fb-3db7-430b-8bac-0f5c7398aed2\") " pod="service-telemetry/service-telemetry-operator-4-build" Jan 23 11:03:09 crc kubenswrapper[4957]: I0123 11:03:09.954076 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/dbc410fb-3db7-430b-8bac-0f5c7398aed2-build-blob-cache\") pod \"service-telemetry-operator-4-build\" (UID: \"dbc410fb-3db7-430b-8bac-0f5c7398aed2\") " pod="service-telemetry/service-telemetry-operator-4-build" Jan 23 11:03:10 crc kubenswrapper[4957]: I0123 11:03:10.055671 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/dbc410fb-3db7-430b-8bac-0f5c7398aed2-build-blob-cache\") pod \"service-telemetry-operator-4-build\" (UID: \"dbc410fb-3db7-430b-8bac-0f5c7398aed2\") " pod="service-telemetry/service-telemetry-operator-4-build" Jan 23 11:03:10 crc kubenswrapper[4957]: I0123 11:03:10.056207 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/dbc410fb-3db7-430b-8bac-0f5c7398aed2-container-storage-root\") pod \"service-telemetry-operator-4-build\" (UID: \"dbc410fb-3db7-430b-8bac-0f5c7398aed2\") " pod="service-telemetry/service-telemetry-operator-4-build" Jan 23 11:03:10 crc kubenswrapper[4957]: I0123 11:03:10.056351 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/dbc410fb-3db7-430b-8bac-0f5c7398aed2-container-storage-run\") pod \"service-telemetry-operator-4-build\" (UID: \"dbc410fb-3db7-430b-8bac-0f5c7398aed2\") " pod="service-telemetry/service-telemetry-operator-4-build" Jan 23 11:03:10 crc kubenswrapper[4957]: I0123 11:03:10.056429 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/dbc410fb-3db7-430b-8bac-0f5c7398aed2-build-blob-cache\") pod \"service-telemetry-operator-4-build\" (UID: \"dbc410fb-3db7-430b-8bac-0f5c7398aed2\") " pod="service-telemetry/service-telemetry-operator-4-build" Jan 23 11:03:10 crc kubenswrapper[4957]: I0123 11:03:10.056580 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-9pp79-pull\" (UniqueName: \"kubernetes.io/secret/dbc410fb-3db7-430b-8bac-0f5c7398aed2-builder-dockercfg-9pp79-pull\") pod \"service-telemetry-operator-4-build\" (UID: \"dbc410fb-3db7-430b-8bac-0f5c7398aed2\") " pod="service-telemetry/service-telemetry-operator-4-build" Jan 23 11:03:10 crc kubenswrapper[4957]: I0123 11:03:10.056761 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-9pp79-push\" (UniqueName: \"kubernetes.io/secret/dbc410fb-3db7-430b-8bac-0f5c7398aed2-builder-dockercfg-9pp79-push\") pod \"service-telemetry-operator-4-build\" (UID: \"dbc410fb-3db7-430b-8bac-0f5c7398aed2\") " pod="service-telemetry/service-telemetry-operator-4-build" Jan 23 11:03:10 crc kubenswrapper[4957]: I0123 11:03:10.056890 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/dbc410fb-3db7-430b-8bac-0f5c7398aed2-buildworkdir\") pod \"service-telemetry-operator-4-build\" (UID: \"dbc410fb-3db7-430b-8bac-0f5c7398aed2\") " pod="service-telemetry/service-telemetry-operator-4-build" Jan 23 11:03:10 crc kubenswrapper[4957]: I0123 11:03:10.057027 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/dbc410fb-3db7-430b-8bac-0f5c7398aed2-node-pullsecrets\") pod \"service-telemetry-operator-4-build\" (UID: \"dbc410fb-3db7-430b-8bac-0f5c7398aed2\") " pod="service-telemetry/service-telemetry-operator-4-build" Jan 23 11:03:10 crc kubenswrapper[4957]: I0123 11:03:10.057173 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/dbc410fb-3db7-430b-8bac-0f5c7398aed2-build-proxy-ca-bundles\") pod \"service-telemetry-operator-4-build\" (UID: \"dbc410fb-3db7-430b-8bac-0f5c7398aed2\") " 
pod="service-telemetry/service-telemetry-operator-4-build" Jan 23 11:03:10 crc kubenswrapper[4957]: I0123 11:03:10.056578 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/dbc410fb-3db7-430b-8bac-0f5c7398aed2-container-storage-root\") pod \"service-telemetry-operator-4-build\" (UID: \"dbc410fb-3db7-430b-8bac-0f5c7398aed2\") " pod="service-telemetry/service-telemetry-operator-4-build" Jan 23 11:03:10 crc kubenswrapper[4957]: I0123 11:03:10.057178 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/dbc410fb-3db7-430b-8bac-0f5c7398aed2-node-pullsecrets\") pod \"service-telemetry-operator-4-build\" (UID: \"dbc410fb-3db7-430b-8bac-0f5c7398aed2\") " pod="service-telemetry/service-telemetry-operator-4-build" Jan 23 11:03:10 crc kubenswrapper[4957]: I0123 11:03:10.056769 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/dbc410fb-3db7-430b-8bac-0f5c7398aed2-container-storage-run\") pod \"service-telemetry-operator-4-build\" (UID: \"dbc410fb-3db7-430b-8bac-0f5c7398aed2\") " pod="service-telemetry/service-telemetry-operator-4-build" Jan 23 11:03:10 crc kubenswrapper[4957]: I0123 11:03:10.057321 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/dbc410fb-3db7-430b-8bac-0f5c7398aed2-build-ca-bundles\") pod \"service-telemetry-operator-4-build\" (UID: \"dbc410fb-3db7-430b-8bac-0f5c7398aed2\") " pod="service-telemetry/service-telemetry-operator-4-build" Jan 23 11:03:10 crc kubenswrapper[4957]: I0123 11:03:10.057481 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/dbc410fb-3db7-430b-8bac-0f5c7398aed2-buildcachedir\") pod \"service-telemetry-operator-4-build\" (UID: \"dbc410fb-3db7-430b-8bac-0f5c7398aed2\") " pod="service-telemetry/service-telemetry-operator-4-build" Jan 23 11:03:10 crc kubenswrapper[4957]: I0123 11:03:10.057530 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-745qf\" (UniqueName: \"kubernetes.io/projected/dbc410fb-3db7-430b-8bac-0f5c7398aed2-kube-api-access-745qf\") pod \"service-telemetry-operator-4-build\" (UID: \"dbc410fb-3db7-430b-8bac-0f5c7398aed2\") " pod="service-telemetry/service-telemetry-operator-4-build" Jan 23 11:03:10 crc kubenswrapper[4957]: I0123 11:03:10.057545 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/dbc410fb-3db7-430b-8bac-0f5c7398aed2-buildcachedir\") pod \"service-telemetry-operator-4-build\" (UID: \"dbc410fb-3db7-430b-8bac-0f5c7398aed2\") " pod="service-telemetry/service-telemetry-operator-4-build" Jan 23 11:03:10 crc kubenswrapper[4957]: I0123 11:03:10.057586 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/dbc410fb-3db7-430b-8bac-0f5c7398aed2-build-system-configs\") pod \"service-telemetry-operator-4-build\" (UID: \"dbc410fb-3db7-430b-8bac-0f5c7398aed2\") " pod="service-telemetry/service-telemetry-operator-4-build" Jan 23 11:03:10 crc kubenswrapper[4957]: I0123 11:03:10.057799 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: 
\"kubernetes.io/empty-dir/dbc410fb-3db7-430b-8bac-0f5c7398aed2-buildworkdir\") pod \"service-telemetry-operator-4-build\" (UID: \"dbc410fb-3db7-430b-8bac-0f5c7398aed2\") " pod="service-telemetry/service-telemetry-operator-4-build" Jan 23 11:03:10 crc kubenswrapper[4957]: I0123 11:03:10.058373 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/dbc410fb-3db7-430b-8bac-0f5c7398aed2-build-system-configs\") pod \"service-telemetry-operator-4-build\" (UID: \"dbc410fb-3db7-430b-8bac-0f5c7398aed2\") " pod="service-telemetry/service-telemetry-operator-4-build" Jan 23 11:03:10 crc kubenswrapper[4957]: I0123 11:03:10.058448 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/dbc410fb-3db7-430b-8bac-0f5c7398aed2-build-proxy-ca-bundles\") pod \"service-telemetry-operator-4-build\" (UID: \"dbc410fb-3db7-430b-8bac-0f5c7398aed2\") " pod="service-telemetry/service-telemetry-operator-4-build" Jan 23 11:03:10 crc kubenswrapper[4957]: I0123 11:03:10.059004 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/dbc410fb-3db7-430b-8bac-0f5c7398aed2-build-ca-bundles\") pod \"service-telemetry-operator-4-build\" (UID: \"dbc410fb-3db7-430b-8bac-0f5c7398aed2\") " pod="service-telemetry/service-telemetry-operator-4-build" Jan 23 11:03:10 crc kubenswrapper[4957]: I0123 11:03:10.061978 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-9pp79-push\" (UniqueName: \"kubernetes.io/secret/dbc410fb-3db7-430b-8bac-0f5c7398aed2-builder-dockercfg-9pp79-push\") pod \"service-telemetry-operator-4-build\" (UID: \"dbc410fb-3db7-430b-8bac-0f5c7398aed2\") " pod="service-telemetry/service-telemetry-operator-4-build" Jan 23 11:03:10 crc kubenswrapper[4957]: I0123 11:03:10.076378 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-9pp79-pull\" (UniqueName: \"kubernetes.io/secret/dbc410fb-3db7-430b-8bac-0f5c7398aed2-builder-dockercfg-9pp79-pull\") pod \"service-telemetry-operator-4-build\" (UID: \"dbc410fb-3db7-430b-8bac-0f5c7398aed2\") " pod="service-telemetry/service-telemetry-operator-4-build" Jan 23 11:03:10 crc kubenswrapper[4957]: I0123 11:03:10.084204 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-745qf\" (UniqueName: \"kubernetes.io/projected/dbc410fb-3db7-430b-8bac-0f5c7398aed2-kube-api-access-745qf\") pod \"service-telemetry-operator-4-build\" (UID: \"dbc410fb-3db7-430b-8bac-0f5c7398aed2\") " pod="service-telemetry/service-telemetry-operator-4-build" Jan 23 11:03:10 crc kubenswrapper[4957]: I0123 11:03:10.182049 4957 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-4-build" Jan 23 11:03:10 crc kubenswrapper[4957]: I0123 11:03:10.423148 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-4-build"] Jan 23 11:03:10 crc kubenswrapper[4957]: I0123 11:03:10.436782 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-4-build" event={"ID":"dbc410fb-3db7-430b-8bac-0f5c7398aed2","Type":"ContainerStarted","Data":"ab75c756cbd0bcc1ee77e746853e4cc662282cbb094fd18f08c8d7ec3acd97ad"} Jan 23 11:03:12 crc kubenswrapper[4957]: I0123 11:03:12.871032 4957 prober.go:107] "Probe failed" probeType="Readiness" pod="service-telemetry/elasticsearch-es-default-0" podUID="8128176f-ed6a-4d47-84dd-11b2ada7a5ca" containerName="elasticsearch" probeResult="failure" output=< Jan 23 11:03:12 crc kubenswrapper[4957]: {"timestamp": "2026-01-23T11:03:12+00:00", "message": "readiness probe failed", "curl_rc": "7"} Jan 23 11:03:12 crc kubenswrapper[4957]: > Jan 23 11:03:13 crc kubenswrapper[4957]: I0123 11:03:13.461088 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-4-build" event={"ID":"dbc410fb-3db7-430b-8bac-0f5c7398aed2","Type":"ContainerStarted","Data":"acb2e304bc12d969916583746df2a0a92bf189c07375155f15e36fb7e533d5e7"} Jan 23 11:03:13 crc kubenswrapper[4957]: E0123 11:03:13.539005 4957 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=2867283724404936083, SKID=, AKID=D1:AA:0A:78:C0:D9:52:38:C6:A3:64:4D:E2:34:7C:D4:DF:E9:F7:AF failed: x509: certificate signed by unknown authority" Jan 23 11:03:14 crc kubenswrapper[4957]: I0123 11:03:14.563670 4957 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-operator-4-build"] Jan 23 11:03:15 crc kubenswrapper[4957]: I0123 11:03:15.479859 4957 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/service-telemetry-operator-4-build" podUID="dbc410fb-3db7-430b-8bac-0f5c7398aed2" containerName="git-clone" containerID="cri-o://acb2e304bc12d969916583746df2a0a92bf189c07375155f15e36fb7e533d5e7" gracePeriod=30 Jan 23 11:03:16 crc kubenswrapper[4957]: I0123 11:03:16.969738 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-4-build_dbc410fb-3db7-430b-8bac-0f5c7398aed2/git-clone/0.log" Jan 23 11:03:16 crc kubenswrapper[4957]: I0123 11:03:16.970086 4957 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-4-build" Jan 23 11:03:17 crc kubenswrapper[4957]: I0123 11:03:17.163466 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/dbc410fb-3db7-430b-8bac-0f5c7398aed2-container-storage-run\") pod \"dbc410fb-3db7-430b-8bac-0f5c7398aed2\" (UID: \"dbc410fb-3db7-430b-8bac-0f5c7398aed2\") " Jan 23 11:03:17 crc kubenswrapper[4957]: I0123 11:03:17.163514 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/dbc410fb-3db7-430b-8bac-0f5c7398aed2-build-proxy-ca-bundles\") pod \"dbc410fb-3db7-430b-8bac-0f5c7398aed2\" (UID: \"dbc410fb-3db7-430b-8bac-0f5c7398aed2\") " Jan 23 11:03:17 crc kubenswrapper[4957]: I0123 11:03:17.163539 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/dbc410fb-3db7-430b-8bac-0f5c7398aed2-node-pullsecrets\") pod \"dbc410fb-3db7-430b-8bac-0f5c7398aed2\" (UID: \"dbc410fb-3db7-430b-8bac-0f5c7398aed2\") " Jan 23 11:03:17 crc kubenswrapper[4957]: I0123 11:03:17.163577 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-9pp79-push\" (UniqueName: \"kubernetes.io/secret/dbc410fb-3db7-430b-8bac-0f5c7398aed2-builder-dockercfg-9pp79-push\") pod \"dbc410fb-3db7-430b-8bac-0f5c7398aed2\" (UID: \"dbc410fb-3db7-430b-8bac-0f5c7398aed2\") " Jan 23 11:03:17 crc kubenswrapper[4957]: I0123 11:03:17.163602 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-9pp79-pull\" (UniqueName: \"kubernetes.io/secret/dbc410fb-3db7-430b-8bac-0f5c7398aed2-builder-dockercfg-9pp79-pull\") pod \"dbc410fb-3db7-430b-8bac-0f5c7398aed2\" (UID: \"dbc410fb-3db7-430b-8bac-0f5c7398aed2\") " Jan 23 11:03:17 crc kubenswrapper[4957]: I0123 11:03:17.163633 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/dbc410fb-3db7-430b-8bac-0f5c7398aed2-buildworkdir\") pod \"dbc410fb-3db7-430b-8bac-0f5c7398aed2\" (UID: \"dbc410fb-3db7-430b-8bac-0f5c7398aed2\") " Jan 23 11:03:17 crc kubenswrapper[4957]: I0123 11:03:17.163674 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/dbc410fb-3db7-430b-8bac-0f5c7398aed2-buildcachedir\") pod \"dbc410fb-3db7-430b-8bac-0f5c7398aed2\" (UID: \"dbc410fb-3db7-430b-8bac-0f5c7398aed2\") " Jan 23 11:03:17 crc kubenswrapper[4957]: I0123 11:03:17.163701 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/dbc410fb-3db7-430b-8bac-0f5c7398aed2-build-system-configs\") pod \"dbc410fb-3db7-430b-8bac-0f5c7398aed2\" (UID: \"dbc410fb-3db7-430b-8bac-0f5c7398aed2\") " Jan 23 11:03:17 crc kubenswrapper[4957]: I0123 11:03:17.163735 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/dbc410fb-3db7-430b-8bac-0f5c7398aed2-build-ca-bundles\") pod \"dbc410fb-3db7-430b-8bac-0f5c7398aed2\" (UID: \"dbc410fb-3db7-430b-8bac-0f5c7398aed2\") " Jan 23 11:03:17 crc kubenswrapper[4957]: I0123 11:03:17.163756 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: 
\"kubernetes.io/empty-dir/dbc410fb-3db7-430b-8bac-0f5c7398aed2-container-storage-root\") pod \"dbc410fb-3db7-430b-8bac-0f5c7398aed2\" (UID: \"dbc410fb-3db7-430b-8bac-0f5c7398aed2\") " Jan 23 11:03:17 crc kubenswrapper[4957]: I0123 11:03:17.163772 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/dbc410fb-3db7-430b-8bac-0f5c7398aed2-build-blob-cache\") pod \"dbc410fb-3db7-430b-8bac-0f5c7398aed2\" (UID: \"dbc410fb-3db7-430b-8bac-0f5c7398aed2\") " Jan 23 11:03:17 crc kubenswrapper[4957]: I0123 11:03:17.163793 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-745qf\" (UniqueName: \"kubernetes.io/projected/dbc410fb-3db7-430b-8bac-0f5c7398aed2-kube-api-access-745qf\") pod \"dbc410fb-3db7-430b-8bac-0f5c7398aed2\" (UID: \"dbc410fb-3db7-430b-8bac-0f5c7398aed2\") " Jan 23 11:03:17 crc kubenswrapper[4957]: I0123 11:03:17.163861 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dbc410fb-3db7-430b-8bac-0f5c7398aed2-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "dbc410fb-3db7-430b-8bac-0f5c7398aed2" (UID: "dbc410fb-3db7-430b-8bac-0f5c7398aed2"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 23 11:03:17 crc kubenswrapper[4957]: I0123 11:03:17.163923 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dbc410fb-3db7-430b-8bac-0f5c7398aed2-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "dbc410fb-3db7-430b-8bac-0f5c7398aed2" (UID: "dbc410fb-3db7-430b-8bac-0f5c7398aed2"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 23 11:03:17 crc kubenswrapper[4957]: I0123 11:03:17.164026 4957 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/dbc410fb-3db7-430b-8bac-0f5c7398aed2-container-storage-run\") on node \"crc\" DevicePath \"\"" Jan 23 11:03:17 crc kubenswrapper[4957]: I0123 11:03:17.164039 4957 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/dbc410fb-3db7-430b-8bac-0f5c7398aed2-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Jan 23 11:03:17 crc kubenswrapper[4957]: I0123 11:03:17.164355 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dbc410fb-3db7-430b-8bac-0f5c7398aed2-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "dbc410fb-3db7-430b-8bac-0f5c7398aed2" (UID: "dbc410fb-3db7-430b-8bac-0f5c7398aed2"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 11:03:17 crc kubenswrapper[4957]: I0123 11:03:17.164395 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dbc410fb-3db7-430b-8bac-0f5c7398aed2-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "dbc410fb-3db7-430b-8bac-0f5c7398aed2" (UID: "dbc410fb-3db7-430b-8bac-0f5c7398aed2"). InnerVolumeSpecName "buildcachedir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 23 11:03:17 crc kubenswrapper[4957]: I0123 11:03:17.164412 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dbc410fb-3db7-430b-8bac-0f5c7398aed2-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "dbc410fb-3db7-430b-8bac-0f5c7398aed2" (UID: "dbc410fb-3db7-430b-8bac-0f5c7398aed2"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 11:03:17 crc kubenswrapper[4957]: I0123 11:03:17.164425 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dbc410fb-3db7-430b-8bac-0f5c7398aed2-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "dbc410fb-3db7-430b-8bac-0f5c7398aed2" (UID: "dbc410fb-3db7-430b-8bac-0f5c7398aed2"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 23 11:03:17 crc kubenswrapper[4957]: I0123 11:03:17.164574 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dbc410fb-3db7-430b-8bac-0f5c7398aed2-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "dbc410fb-3db7-430b-8bac-0f5c7398aed2" (UID: "dbc410fb-3db7-430b-8bac-0f5c7398aed2"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 23 11:03:17 crc kubenswrapper[4957]: I0123 11:03:17.164596 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dbc410fb-3db7-430b-8bac-0f5c7398aed2-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "dbc410fb-3db7-430b-8bac-0f5c7398aed2" (UID: "dbc410fb-3db7-430b-8bac-0f5c7398aed2"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 23 11:03:17 crc kubenswrapper[4957]: I0123 11:03:17.164678 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dbc410fb-3db7-430b-8bac-0f5c7398aed2-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "dbc410fb-3db7-430b-8bac-0f5c7398aed2" (UID: "dbc410fb-3db7-430b-8bac-0f5c7398aed2"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 11:03:17 crc kubenswrapper[4957]: I0123 11:03:17.168460 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbc410fb-3db7-430b-8bac-0f5c7398aed2-builder-dockercfg-9pp79-push" (OuterVolumeSpecName: "builder-dockercfg-9pp79-push") pod "dbc410fb-3db7-430b-8bac-0f5c7398aed2" (UID: "dbc410fb-3db7-430b-8bac-0f5c7398aed2"). InnerVolumeSpecName "builder-dockercfg-9pp79-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 23 11:03:17 crc kubenswrapper[4957]: I0123 11:03:17.171834 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbc410fb-3db7-430b-8bac-0f5c7398aed2-builder-dockercfg-9pp79-pull" (OuterVolumeSpecName: "builder-dockercfg-9pp79-pull") pod "dbc410fb-3db7-430b-8bac-0f5c7398aed2" (UID: "dbc410fb-3db7-430b-8bac-0f5c7398aed2"). InnerVolumeSpecName "builder-dockercfg-9pp79-pull". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 23 11:03:17 crc kubenswrapper[4957]: I0123 11:03:17.173398 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbc410fb-3db7-430b-8bac-0f5c7398aed2-kube-api-access-745qf" (OuterVolumeSpecName: "kube-api-access-745qf") pod "dbc410fb-3db7-430b-8bac-0f5c7398aed2" (UID: "dbc410fb-3db7-430b-8bac-0f5c7398aed2"). InnerVolumeSpecName "kube-api-access-745qf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 23 11:03:17 crc kubenswrapper[4957]: I0123 11:03:17.265810 4957 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/dbc410fb-3db7-430b-8bac-0f5c7398aed2-buildcachedir\") on node \"crc\" DevicePath \"\"" Jan 23 11:03:17 crc kubenswrapper[4957]: I0123 11:03:17.265878 4957 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/dbc410fb-3db7-430b-8bac-0f5c7398aed2-build-system-configs\") on node \"crc\" DevicePath \"\"" Jan 23 11:03:17 crc kubenswrapper[4957]: I0123 11:03:17.265893 4957 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/dbc410fb-3db7-430b-8bac-0f5c7398aed2-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 23 11:03:17 crc kubenswrapper[4957]: I0123 11:03:17.265922 4957 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/dbc410fb-3db7-430b-8bac-0f5c7398aed2-container-storage-root\") on node \"crc\" DevicePath \"\"" Jan 23 11:03:17 crc kubenswrapper[4957]: I0123 11:03:17.265933 4957 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/dbc410fb-3db7-430b-8bac-0f5c7398aed2-build-blob-cache\") on node \"crc\" DevicePath \"\"" Jan 23 11:03:17 crc kubenswrapper[4957]: I0123 11:03:17.265943 4957 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-745qf\" (UniqueName: \"kubernetes.io/projected/dbc410fb-3db7-430b-8bac-0f5c7398aed2-kube-api-access-745qf\") on node \"crc\" DevicePath \"\"" Jan 23 11:03:17 crc kubenswrapper[4957]: I0123 11:03:17.265952 4957 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/dbc410fb-3db7-430b-8bac-0f5c7398aed2-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 23 11:03:17 crc kubenswrapper[4957]: I0123 11:03:17.265961 4957 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-9pp79-push\" (UniqueName: \"kubernetes.io/secret/dbc410fb-3db7-430b-8bac-0f5c7398aed2-builder-dockercfg-9pp79-push\") on node \"crc\" DevicePath \"\"" Jan 23 11:03:17 crc kubenswrapper[4957]: I0123 11:03:17.265972 4957 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-9pp79-pull\" (UniqueName: \"kubernetes.io/secret/dbc410fb-3db7-430b-8bac-0f5c7398aed2-builder-dockercfg-9pp79-pull\") on node \"crc\" DevicePath \"\"" Jan 23 11:03:17 crc kubenswrapper[4957]: I0123 11:03:17.266004 4957 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/dbc410fb-3db7-430b-8bac-0f5c7398aed2-buildworkdir\") on node \"crc\" DevicePath \"\"" Jan 23 11:03:17 crc kubenswrapper[4957]: I0123 11:03:17.493559 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-4-build_dbc410fb-3db7-430b-8bac-0f5c7398aed2/git-clone/0.log" Jan 23 
11:03:17 crc kubenswrapper[4957]: I0123 11:03:17.493617 4957 generic.go:334] "Generic (PLEG): container finished" podID="dbc410fb-3db7-430b-8bac-0f5c7398aed2" containerID="acb2e304bc12d969916583746df2a0a92bf189c07375155f15e36fb7e533d5e7" exitCode=1 Jan 23 11:03:17 crc kubenswrapper[4957]: I0123 11:03:17.493654 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-4-build" event={"ID":"dbc410fb-3db7-430b-8bac-0f5c7398aed2","Type":"ContainerDied","Data":"acb2e304bc12d969916583746df2a0a92bf189c07375155f15e36fb7e533d5e7"} Jan 23 11:03:17 crc kubenswrapper[4957]: I0123 11:03:17.493681 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-4-build" event={"ID":"dbc410fb-3db7-430b-8bac-0f5c7398aed2","Type":"ContainerDied","Data":"ab75c756cbd0bcc1ee77e746853e4cc662282cbb094fd18f08c8d7ec3acd97ad"} Jan 23 11:03:17 crc kubenswrapper[4957]: I0123 11:03:17.493697 4957 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-4-build" Jan 23 11:03:17 crc kubenswrapper[4957]: I0123 11:03:17.493700 4957 scope.go:117] "RemoveContainer" containerID="acb2e304bc12d969916583746df2a0a92bf189c07375155f15e36fb7e533d5e7" Jan 23 11:03:17 crc kubenswrapper[4957]: I0123 11:03:17.514342 4957 scope.go:117] "RemoveContainer" containerID="acb2e304bc12d969916583746df2a0a92bf189c07375155f15e36fb7e533d5e7" Jan 23 11:03:17 crc kubenswrapper[4957]: E0123 11:03:17.514836 4957 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"acb2e304bc12d969916583746df2a0a92bf189c07375155f15e36fb7e533d5e7\": container with ID starting with acb2e304bc12d969916583746df2a0a92bf189c07375155f15e36fb7e533d5e7 not found: ID does not exist" containerID="acb2e304bc12d969916583746df2a0a92bf189c07375155f15e36fb7e533d5e7" Jan 23 11:03:17 crc kubenswrapper[4957]: I0123 11:03:17.514884 4957 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"acb2e304bc12d969916583746df2a0a92bf189c07375155f15e36fb7e533d5e7"} err="failed to get container status \"acb2e304bc12d969916583746df2a0a92bf189c07375155f15e36fb7e533d5e7\": rpc error: code = NotFound desc = could not find container \"acb2e304bc12d969916583746df2a0a92bf189c07375155f15e36fb7e533d5e7\": container with ID starting with acb2e304bc12d969916583746df2a0a92bf189c07375155f15e36fb7e533d5e7 not found: ID does not exist" Jan 23 11:03:17 crc kubenswrapper[4957]: I0123 11:03:17.530655 4957 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-operator-4-build"] Jan 23 11:03:17 crc kubenswrapper[4957]: I0123 11:03:17.540910 4957 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/service-telemetry-operator-4-build"] Jan 23 11:03:18 crc kubenswrapper[4957]: I0123 11:03:18.754660 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="service-telemetry/elasticsearch-es-default-0" Jan 23 11:03:18 crc kubenswrapper[4957]: I0123 11:03:18.779033 4957 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dbc410fb-3db7-430b-8bac-0f5c7398aed2" path="/var/lib/kubelet/pods/dbc410fb-3db7-430b-8bac-0f5c7398aed2/volumes" Jan 23 11:03:26 crc kubenswrapper[4957]: I0123 11:03:26.056150 4957 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-5-build"] Jan 23 11:03:26 crc kubenswrapper[4957]: E0123 11:03:26.058396 
4957 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbc410fb-3db7-430b-8bac-0f5c7398aed2" containerName="git-clone" Jan 23 11:03:26 crc kubenswrapper[4957]: I0123 11:03:26.058563 4957 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbc410fb-3db7-430b-8bac-0f5c7398aed2" containerName="git-clone" Jan 23 11:03:26 crc kubenswrapper[4957]: I0123 11:03:26.059256 4957 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbc410fb-3db7-430b-8bac-0f5c7398aed2" containerName="git-clone" Jan 23 11:03:26 crc kubenswrapper[4957]: I0123 11:03:26.070304 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-5-build" Jan 23 11:03:26 crc kubenswrapper[4957]: I0123 11:03:26.076822 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-5-global-ca" Jan 23 11:03:26 crc kubenswrapper[4957]: I0123 11:03:26.077328 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-5-ca" Jan 23 11:03:26 crc kubenswrapper[4957]: I0123 11:03:26.077842 4957 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-9pp79" Jan 23 11:03:26 crc kubenswrapper[4957]: I0123 11:03:26.079017 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-5-sys-config" Jan 23 11:03:26 crc kubenswrapper[4957]: I0123 11:03:26.089381 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-5-build"] Jan 23 11:03:26 crc kubenswrapper[4957]: I0123 11:03:26.180356 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/3f618edb-3184-469a-ac70-92ea11b9eae4-node-pullsecrets\") pod \"service-telemetry-operator-5-build\" (UID: \"3f618edb-3184-469a-ac70-92ea11b9eae4\") " pod="service-telemetry/service-telemetry-operator-5-build" Jan 23 11:03:26 crc kubenswrapper[4957]: I0123 11:03:26.180432 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/3f618edb-3184-469a-ac70-92ea11b9eae4-container-storage-root\") pod \"service-telemetry-operator-5-build\" (UID: \"3f618edb-3184-469a-ac70-92ea11b9eae4\") " pod="service-telemetry/service-telemetry-operator-5-build" Jan 23 11:03:26 crc kubenswrapper[4957]: I0123 11:03:26.180467 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/3f618edb-3184-469a-ac70-92ea11b9eae4-buildcachedir\") pod \"service-telemetry-operator-5-build\" (UID: \"3f618edb-3184-469a-ac70-92ea11b9eae4\") " pod="service-telemetry/service-telemetry-operator-5-build" Jan 23 11:03:26 crc kubenswrapper[4957]: I0123 11:03:26.180490 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/3f618edb-3184-469a-ac70-92ea11b9eae4-buildworkdir\") pod \"service-telemetry-operator-5-build\" (UID: \"3f618edb-3184-469a-ac70-92ea11b9eae4\") " pod="service-telemetry/service-telemetry-operator-5-build" Jan 23 11:03:26 crc kubenswrapper[4957]: I0123 11:03:26.180510 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"builder-dockercfg-9pp79-push\" (UniqueName: \"kubernetes.io/secret/3f618edb-3184-469a-ac70-92ea11b9eae4-builder-dockercfg-9pp79-push\") pod \"service-telemetry-operator-5-build\" (UID: \"3f618edb-3184-469a-ac70-92ea11b9eae4\") " pod="service-telemetry/service-telemetry-operator-5-build" Jan 23 11:03:26 crc kubenswrapper[4957]: I0123 11:03:26.180547 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/3f618edb-3184-469a-ac70-92ea11b9eae4-build-system-configs\") pod \"service-telemetry-operator-5-build\" (UID: \"3f618edb-3184-469a-ac70-92ea11b9eae4\") " pod="service-telemetry/service-telemetry-operator-5-build" Jan 23 11:03:26 crc kubenswrapper[4957]: I0123 11:03:26.180583 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3f618edb-3184-469a-ac70-92ea11b9eae4-build-ca-bundles\") pod \"service-telemetry-operator-5-build\" (UID: \"3f618edb-3184-469a-ac70-92ea11b9eae4\") " pod="service-telemetry/service-telemetry-operator-5-build" Jan 23 11:03:26 crc kubenswrapper[4957]: I0123 11:03:26.180640 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2w7tv\" (UniqueName: \"kubernetes.io/projected/3f618edb-3184-469a-ac70-92ea11b9eae4-kube-api-access-2w7tv\") pod \"service-telemetry-operator-5-build\" (UID: \"3f618edb-3184-469a-ac70-92ea11b9eae4\") " pod="service-telemetry/service-telemetry-operator-5-build" Jan 23 11:03:26 crc kubenswrapper[4957]: I0123 11:03:26.180665 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3f618edb-3184-469a-ac70-92ea11b9eae4-build-proxy-ca-bundles\") pod \"service-telemetry-operator-5-build\" (UID: \"3f618edb-3184-469a-ac70-92ea11b9eae4\") " pod="service-telemetry/service-telemetry-operator-5-build" Jan 23 11:03:26 crc kubenswrapper[4957]: I0123 11:03:26.180716 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/3f618edb-3184-469a-ac70-92ea11b9eae4-build-blob-cache\") pod \"service-telemetry-operator-5-build\" (UID: \"3f618edb-3184-469a-ac70-92ea11b9eae4\") " pod="service-telemetry/service-telemetry-operator-5-build" Jan 23 11:03:26 crc kubenswrapper[4957]: I0123 11:03:26.180748 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/3f618edb-3184-469a-ac70-92ea11b9eae4-container-storage-run\") pod \"service-telemetry-operator-5-build\" (UID: \"3f618edb-3184-469a-ac70-92ea11b9eae4\") " pod="service-telemetry/service-telemetry-operator-5-build" Jan 23 11:03:26 crc kubenswrapper[4957]: I0123 11:03:26.180813 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-9pp79-pull\" (UniqueName: \"kubernetes.io/secret/3f618edb-3184-469a-ac70-92ea11b9eae4-builder-dockercfg-9pp79-pull\") pod \"service-telemetry-operator-5-build\" (UID: \"3f618edb-3184-469a-ac70-92ea11b9eae4\") " pod="service-telemetry/service-telemetry-operator-5-build" Jan 23 11:03:26 crc kubenswrapper[4957]: I0123 11:03:26.281944 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" 
(UniqueName: \"kubernetes.io/empty-dir/3f618edb-3184-469a-ac70-92ea11b9eae4-container-storage-run\") pod \"service-telemetry-operator-5-build\" (UID: \"3f618edb-3184-469a-ac70-92ea11b9eae4\") " pod="service-telemetry/service-telemetry-operator-5-build" Jan 23 11:03:26 crc kubenswrapper[4957]: I0123 11:03:26.282023 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-9pp79-pull\" (UniqueName: \"kubernetes.io/secret/3f618edb-3184-469a-ac70-92ea11b9eae4-builder-dockercfg-9pp79-pull\") pod \"service-telemetry-operator-5-build\" (UID: \"3f618edb-3184-469a-ac70-92ea11b9eae4\") " pod="service-telemetry/service-telemetry-operator-5-build" Jan 23 11:03:26 crc kubenswrapper[4957]: I0123 11:03:26.282073 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/3f618edb-3184-469a-ac70-92ea11b9eae4-node-pullsecrets\") pod \"service-telemetry-operator-5-build\" (UID: \"3f618edb-3184-469a-ac70-92ea11b9eae4\") " pod="service-telemetry/service-telemetry-operator-5-build" Jan 23 11:03:26 crc kubenswrapper[4957]: I0123 11:03:26.282131 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/3f618edb-3184-469a-ac70-92ea11b9eae4-container-storage-root\") pod \"service-telemetry-operator-5-build\" (UID: \"3f618edb-3184-469a-ac70-92ea11b9eae4\") " pod="service-telemetry/service-telemetry-operator-5-build" Jan 23 11:03:26 crc kubenswrapper[4957]: I0123 11:03:26.282173 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/3f618edb-3184-469a-ac70-92ea11b9eae4-buildcachedir\") pod \"service-telemetry-operator-5-build\" (UID: \"3f618edb-3184-469a-ac70-92ea11b9eae4\") " pod="service-telemetry/service-telemetry-operator-5-build" Jan 23 11:03:26 crc kubenswrapper[4957]: I0123 11:03:26.282203 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/3f618edb-3184-469a-ac70-92ea11b9eae4-buildworkdir\") pod \"service-telemetry-operator-5-build\" (UID: \"3f618edb-3184-469a-ac70-92ea11b9eae4\") " pod="service-telemetry/service-telemetry-operator-5-build" Jan 23 11:03:26 crc kubenswrapper[4957]: I0123 11:03:26.282236 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-9pp79-push\" (UniqueName: \"kubernetes.io/secret/3f618edb-3184-469a-ac70-92ea11b9eae4-builder-dockercfg-9pp79-push\") pod \"service-telemetry-operator-5-build\" (UID: \"3f618edb-3184-469a-ac70-92ea11b9eae4\") " pod="service-telemetry/service-telemetry-operator-5-build" Jan 23 11:03:26 crc kubenswrapper[4957]: I0123 11:03:26.282255 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/3f618edb-3184-469a-ac70-92ea11b9eae4-node-pullsecrets\") pod \"service-telemetry-operator-5-build\" (UID: \"3f618edb-3184-469a-ac70-92ea11b9eae4\") " pod="service-telemetry/service-telemetry-operator-5-build" Jan 23 11:03:26 crc kubenswrapper[4957]: I0123 11:03:26.282263 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3f618edb-3184-469a-ac70-92ea11b9eae4-build-ca-bundles\") pod \"service-telemetry-operator-5-build\" (UID: \"3f618edb-3184-469a-ac70-92ea11b9eae4\") " 
pod="service-telemetry/service-telemetry-operator-5-build" Jan 23 11:03:26 crc kubenswrapper[4957]: I0123 11:03:26.282345 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/3f618edb-3184-469a-ac70-92ea11b9eae4-buildcachedir\") pod \"service-telemetry-operator-5-build\" (UID: \"3f618edb-3184-469a-ac70-92ea11b9eae4\") " pod="service-telemetry/service-telemetry-operator-5-build" Jan 23 11:03:26 crc kubenswrapper[4957]: I0123 11:03:26.282362 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/3f618edb-3184-469a-ac70-92ea11b9eae4-build-system-configs\") pod \"service-telemetry-operator-5-build\" (UID: \"3f618edb-3184-469a-ac70-92ea11b9eae4\") " pod="service-telemetry/service-telemetry-operator-5-build" Jan 23 11:03:26 crc kubenswrapper[4957]: I0123 11:03:26.282443 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2w7tv\" (UniqueName: \"kubernetes.io/projected/3f618edb-3184-469a-ac70-92ea11b9eae4-kube-api-access-2w7tv\") pod \"service-telemetry-operator-5-build\" (UID: \"3f618edb-3184-469a-ac70-92ea11b9eae4\") " pod="service-telemetry/service-telemetry-operator-5-build" Jan 23 11:03:26 crc kubenswrapper[4957]: I0123 11:03:26.282476 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3f618edb-3184-469a-ac70-92ea11b9eae4-build-proxy-ca-bundles\") pod \"service-telemetry-operator-5-build\" (UID: \"3f618edb-3184-469a-ac70-92ea11b9eae4\") " pod="service-telemetry/service-telemetry-operator-5-build" Jan 23 11:03:26 crc kubenswrapper[4957]: I0123 11:03:26.282517 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/3f618edb-3184-469a-ac70-92ea11b9eae4-build-blob-cache\") pod \"service-telemetry-operator-5-build\" (UID: \"3f618edb-3184-469a-ac70-92ea11b9eae4\") " pod="service-telemetry/service-telemetry-operator-5-build" Jan 23 11:03:26 crc kubenswrapper[4957]: I0123 11:03:26.282638 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/3f618edb-3184-469a-ac70-92ea11b9eae4-container-storage-run\") pod \"service-telemetry-operator-5-build\" (UID: \"3f618edb-3184-469a-ac70-92ea11b9eae4\") " pod="service-telemetry/service-telemetry-operator-5-build" Jan 23 11:03:26 crc kubenswrapper[4957]: I0123 11:03:26.283357 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3f618edb-3184-469a-ac70-92ea11b9eae4-build-ca-bundles\") pod \"service-telemetry-operator-5-build\" (UID: \"3f618edb-3184-469a-ac70-92ea11b9eae4\") " pod="service-telemetry/service-telemetry-operator-5-build" Jan 23 11:03:26 crc kubenswrapper[4957]: I0123 11:03:26.283409 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/3f618edb-3184-469a-ac70-92ea11b9eae4-build-system-configs\") pod \"service-telemetry-operator-5-build\" (UID: \"3f618edb-3184-469a-ac70-92ea11b9eae4\") " pod="service-telemetry/service-telemetry-operator-5-build" Jan 23 11:03:26 crc kubenswrapper[4957]: I0123 11:03:26.283615 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: 
\"kubernetes.io/empty-dir/3f618edb-3184-469a-ac70-92ea11b9eae4-container-storage-root\") pod \"service-telemetry-operator-5-build\" (UID: \"3f618edb-3184-469a-ac70-92ea11b9eae4\") " pod="service-telemetry/service-telemetry-operator-5-build" Jan 23 11:03:26 crc kubenswrapper[4957]: I0123 11:03:26.283798 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/3f618edb-3184-469a-ac70-92ea11b9eae4-build-blob-cache\") pod \"service-telemetry-operator-5-build\" (UID: \"3f618edb-3184-469a-ac70-92ea11b9eae4\") " pod="service-telemetry/service-telemetry-operator-5-build" Jan 23 11:03:26 crc kubenswrapper[4957]: I0123 11:03:26.284036 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3f618edb-3184-469a-ac70-92ea11b9eae4-build-proxy-ca-bundles\") pod \"service-telemetry-operator-5-build\" (UID: \"3f618edb-3184-469a-ac70-92ea11b9eae4\") " pod="service-telemetry/service-telemetry-operator-5-build" Jan 23 11:03:26 crc kubenswrapper[4957]: I0123 11:03:26.284178 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/3f618edb-3184-469a-ac70-92ea11b9eae4-buildworkdir\") pod \"service-telemetry-operator-5-build\" (UID: \"3f618edb-3184-469a-ac70-92ea11b9eae4\") " pod="service-telemetry/service-telemetry-operator-5-build" Jan 23 11:03:26 crc kubenswrapper[4957]: I0123 11:03:26.288883 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-9pp79-pull\" (UniqueName: \"kubernetes.io/secret/3f618edb-3184-469a-ac70-92ea11b9eae4-builder-dockercfg-9pp79-pull\") pod \"service-telemetry-operator-5-build\" (UID: \"3f618edb-3184-469a-ac70-92ea11b9eae4\") " pod="service-telemetry/service-telemetry-operator-5-build" Jan 23 11:03:26 crc kubenswrapper[4957]: I0123 11:03:26.290932 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-9pp79-push\" (UniqueName: \"kubernetes.io/secret/3f618edb-3184-469a-ac70-92ea11b9eae4-builder-dockercfg-9pp79-push\") pod \"service-telemetry-operator-5-build\" (UID: \"3f618edb-3184-469a-ac70-92ea11b9eae4\") " pod="service-telemetry/service-telemetry-operator-5-build" Jan 23 11:03:26 crc kubenswrapper[4957]: I0123 11:03:26.306503 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2w7tv\" (UniqueName: \"kubernetes.io/projected/3f618edb-3184-469a-ac70-92ea11b9eae4-kube-api-access-2w7tv\") pod \"service-telemetry-operator-5-build\" (UID: \"3f618edb-3184-469a-ac70-92ea11b9eae4\") " pod="service-telemetry/service-telemetry-operator-5-build" Jan 23 11:03:26 crc kubenswrapper[4957]: I0123 11:03:26.396150 4957 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-5-build" Jan 23 11:03:26 crc kubenswrapper[4957]: I0123 11:03:26.594432 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-5-build"] Jan 23 11:03:27 crc kubenswrapper[4957]: I0123 11:03:27.561863 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-5-build" event={"ID":"3f618edb-3184-469a-ac70-92ea11b9eae4","Type":"ContainerStarted","Data":"89458461a7a29d04f7ffc591775b96f396f60df7e52c722c60b76c69250d5ff4"} Jan 23 11:03:27 crc kubenswrapper[4957]: I0123 11:03:27.562147 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-5-build" event={"ID":"3f618edb-3184-469a-ac70-92ea11b9eae4","Type":"ContainerStarted","Data":"5720d790b1696c04e1a9deda7f36f6c3a6b27cc8cf713c21b3370af14c68204d"} Jan 23 11:03:27 crc kubenswrapper[4957]: E0123 11:03:27.626379 4957 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=2867283724404936083, SKID=, AKID=D1:AA:0A:78:C0:D9:52:38:C6:A3:64:4D:E2:34:7C:D4:DF:E9:F7:AF failed: x509: certificate signed by unknown authority" Jan 23 11:03:28 crc kubenswrapper[4957]: I0123 11:03:28.649494 4957 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-operator-5-build"] Jan 23 11:03:28 crc kubenswrapper[4957]: I0123 11:03:28.738236 4957 prober.go:107] "Probe failed" probeType="Liveness" pod="hostpath-provisioner/csi-hostpathplugin-z8lhm" podUID="d140b4dc-6d8e-4940-9a60-aa98665ac1b2" containerName="hostpath-provisioner" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 23 11:03:29 crc kubenswrapper[4957]: I0123 11:03:29.572250 4957 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/service-telemetry-operator-5-build" podUID="3f618edb-3184-469a-ac70-92ea11b9eae4" containerName="git-clone" containerID="cri-o://89458461a7a29d04f7ffc591775b96f396f60df7e52c722c60b76c69250d5ff4" gracePeriod=30 Jan 23 11:03:29 crc kubenswrapper[4957]: I0123 11:03:29.935657 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-5-build_3f618edb-3184-469a-ac70-92ea11b9eae4/git-clone/0.log" Jan 23 11:03:29 crc kubenswrapper[4957]: I0123 11:03:29.935964 4957 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-5-build" Jan 23 11:03:30 crc kubenswrapper[4957]: I0123 11:03:30.037064 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/3f618edb-3184-469a-ac70-92ea11b9eae4-container-storage-root\") pod \"3f618edb-3184-469a-ac70-92ea11b9eae4\" (UID: \"3f618edb-3184-469a-ac70-92ea11b9eae4\") " Jan 23 11:03:30 crc kubenswrapper[4957]: I0123 11:03:30.037143 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/3f618edb-3184-469a-ac70-92ea11b9eae4-build-blob-cache\") pod \"3f618edb-3184-469a-ac70-92ea11b9eae4\" (UID: \"3f618edb-3184-469a-ac70-92ea11b9eae4\") " Jan 23 11:03:30 crc kubenswrapper[4957]: I0123 11:03:30.037195 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/3f618edb-3184-469a-ac70-92ea11b9eae4-build-system-configs\") pod \"3f618edb-3184-469a-ac70-92ea11b9eae4\" (UID: \"3f618edb-3184-469a-ac70-92ea11b9eae4\") " Jan 23 11:03:30 crc kubenswrapper[4957]: I0123 11:03:30.037244 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/3f618edb-3184-469a-ac70-92ea11b9eae4-buildcachedir\") pod \"3f618edb-3184-469a-ac70-92ea11b9eae4\" (UID: \"3f618edb-3184-469a-ac70-92ea11b9eae4\") " Jan 23 11:03:30 crc kubenswrapper[4957]: I0123 11:03:30.037320 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/3f618edb-3184-469a-ac70-92ea11b9eae4-node-pullsecrets\") pod \"3f618edb-3184-469a-ac70-92ea11b9eae4\" (UID: \"3f618edb-3184-469a-ac70-92ea11b9eae4\") " Jan 23 11:03:30 crc kubenswrapper[4957]: I0123 11:03:30.037350 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3f618edb-3184-469a-ac70-92ea11b9eae4-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "3f618edb-3184-469a-ac70-92ea11b9eae4" (UID: "3f618edb-3184-469a-ac70-92ea11b9eae4"). InnerVolumeSpecName "buildcachedir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 23 11:03:30 crc kubenswrapper[4957]: I0123 11:03:30.037365 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/3f618edb-3184-469a-ac70-92ea11b9eae4-buildworkdir\") pod \"3f618edb-3184-469a-ac70-92ea11b9eae4\" (UID: \"3f618edb-3184-469a-ac70-92ea11b9eae4\") " Jan 23 11:03:30 crc kubenswrapper[4957]: I0123 11:03:30.037417 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-9pp79-pull\" (UniqueName: \"kubernetes.io/secret/3f618edb-3184-469a-ac70-92ea11b9eae4-builder-dockercfg-9pp79-pull\") pod \"3f618edb-3184-469a-ac70-92ea11b9eae4\" (UID: \"3f618edb-3184-469a-ac70-92ea11b9eae4\") " Jan 23 11:03:30 crc kubenswrapper[4957]: I0123 11:03:30.037457 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-9pp79-push\" (UniqueName: \"kubernetes.io/secret/3f618edb-3184-469a-ac70-92ea11b9eae4-builder-dockercfg-9pp79-push\") pod \"3f618edb-3184-469a-ac70-92ea11b9eae4\" (UID: \"3f618edb-3184-469a-ac70-92ea11b9eae4\") " Jan 23 11:03:30 crc kubenswrapper[4957]: I0123 11:03:30.037455 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3f618edb-3184-469a-ac70-92ea11b9eae4-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "3f618edb-3184-469a-ac70-92ea11b9eae4" (UID: "3f618edb-3184-469a-ac70-92ea11b9eae4"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 23 11:03:30 crc kubenswrapper[4957]: I0123 11:03:30.037513 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/3f618edb-3184-469a-ac70-92ea11b9eae4-container-storage-run\") pod \"3f618edb-3184-469a-ac70-92ea11b9eae4\" (UID: \"3f618edb-3184-469a-ac70-92ea11b9eae4\") " Jan 23 11:03:30 crc kubenswrapper[4957]: I0123 11:03:30.037552 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3f618edb-3184-469a-ac70-92ea11b9eae4-build-proxy-ca-bundles\") pod \"3f618edb-3184-469a-ac70-92ea11b9eae4\" (UID: \"3f618edb-3184-469a-ac70-92ea11b9eae4\") " Jan 23 11:03:30 crc kubenswrapper[4957]: I0123 11:03:30.037599 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w7tv\" (UniqueName: \"kubernetes.io/projected/3f618edb-3184-469a-ac70-92ea11b9eae4-kube-api-access-2w7tv\") pod \"3f618edb-3184-469a-ac70-92ea11b9eae4\" (UID: \"3f618edb-3184-469a-ac70-92ea11b9eae4\") " Jan 23 11:03:30 crc kubenswrapper[4957]: I0123 11:03:30.037641 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3f618edb-3184-469a-ac70-92ea11b9eae4-build-ca-bundles\") pod \"3f618edb-3184-469a-ac70-92ea11b9eae4\" (UID: \"3f618edb-3184-469a-ac70-92ea11b9eae4\") " Jan 23 11:03:30 crc kubenswrapper[4957]: I0123 11:03:30.037715 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f618edb-3184-469a-ac70-92ea11b9eae4-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "3f618edb-3184-469a-ac70-92ea11b9eae4" (UID: "3f618edb-3184-469a-ac70-92ea11b9eae4"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 23 11:03:30 crc kubenswrapper[4957]: I0123 11:03:30.037830 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f618edb-3184-469a-ac70-92ea11b9eae4-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "3f618edb-3184-469a-ac70-92ea11b9eae4" (UID: "3f618edb-3184-469a-ac70-92ea11b9eae4"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 23 11:03:30 crc kubenswrapper[4957]: I0123 11:03:30.037834 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f618edb-3184-469a-ac70-92ea11b9eae4-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "3f618edb-3184-469a-ac70-92ea11b9eae4" (UID: "3f618edb-3184-469a-ac70-92ea11b9eae4"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 11:03:30 crc kubenswrapper[4957]: I0123 11:03:30.037956 4957 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/3f618edb-3184-469a-ac70-92ea11b9eae4-build-blob-cache\") on node \"crc\" DevicePath \"\"" Jan 23 11:03:30 crc kubenswrapper[4957]: I0123 11:03:30.037972 4957 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/3f618edb-3184-469a-ac70-92ea11b9eae4-build-system-configs\") on node \"crc\" DevicePath \"\"" Jan 23 11:03:30 crc kubenswrapper[4957]: I0123 11:03:30.037988 4957 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/3f618edb-3184-469a-ac70-92ea11b9eae4-buildcachedir\") on node \"crc\" DevicePath \"\"" Jan 23 11:03:30 crc kubenswrapper[4957]: I0123 11:03:30.038080 4957 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/3f618edb-3184-469a-ac70-92ea11b9eae4-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Jan 23 11:03:30 crc kubenswrapper[4957]: I0123 11:03:30.038092 4957 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/3f618edb-3184-469a-ac70-92ea11b9eae4-buildworkdir\") on node \"crc\" DevicePath \"\"" Jan 23 11:03:30 crc kubenswrapper[4957]: I0123 11:03:30.038233 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f618edb-3184-469a-ac70-92ea11b9eae4-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "3f618edb-3184-469a-ac70-92ea11b9eae4" (UID: "3f618edb-3184-469a-ac70-92ea11b9eae4"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 23 11:03:30 crc kubenswrapper[4957]: I0123 11:03:30.038381 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f618edb-3184-469a-ac70-92ea11b9eae4-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "3f618edb-3184-469a-ac70-92ea11b9eae4" (UID: "3f618edb-3184-469a-ac70-92ea11b9eae4"). InnerVolumeSpecName "build-proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 11:03:30 crc kubenswrapper[4957]: I0123 11:03:30.038661 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f618edb-3184-469a-ac70-92ea11b9eae4-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "3f618edb-3184-469a-ac70-92ea11b9eae4" (UID: "3f618edb-3184-469a-ac70-92ea11b9eae4"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 23 11:03:30 crc kubenswrapper[4957]: I0123 11:03:30.038793 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f618edb-3184-469a-ac70-92ea11b9eae4-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "3f618edb-3184-469a-ac70-92ea11b9eae4" (UID: "3f618edb-3184-469a-ac70-92ea11b9eae4"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 23 11:03:30 crc kubenswrapper[4957]: I0123 11:03:30.043789 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f618edb-3184-469a-ac70-92ea11b9eae4-builder-dockercfg-9pp79-push" (OuterVolumeSpecName: "builder-dockercfg-9pp79-push") pod "3f618edb-3184-469a-ac70-92ea11b9eae4" (UID: "3f618edb-3184-469a-ac70-92ea11b9eae4"). InnerVolumeSpecName "builder-dockercfg-9pp79-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 23 11:03:30 crc kubenswrapper[4957]: I0123 11:03:30.044709 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f618edb-3184-469a-ac70-92ea11b9eae4-kube-api-access-2w7tv" (OuterVolumeSpecName: "kube-api-access-2w7tv") pod "3f618edb-3184-469a-ac70-92ea11b9eae4" (UID: "3f618edb-3184-469a-ac70-92ea11b9eae4"). InnerVolumeSpecName "kube-api-access-2w7tv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 23 11:03:30 crc kubenswrapper[4957]: I0123 11:03:30.044793 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f618edb-3184-469a-ac70-92ea11b9eae4-builder-dockercfg-9pp79-pull" (OuterVolumeSpecName: "builder-dockercfg-9pp79-pull") pod "3f618edb-3184-469a-ac70-92ea11b9eae4" (UID: "3f618edb-3184-469a-ac70-92ea11b9eae4"). InnerVolumeSpecName "builder-dockercfg-9pp79-pull". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 23 11:03:30 crc kubenswrapper[4957]: I0123 11:03:30.139659 4957 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3f618edb-3184-469a-ac70-92ea11b9eae4-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 23 11:03:30 crc kubenswrapper[4957]: I0123 11:03:30.139704 4957 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/3f618edb-3184-469a-ac70-92ea11b9eae4-container-storage-root\") on node \"crc\" DevicePath \"\"" Jan 23 11:03:30 crc kubenswrapper[4957]: I0123 11:03:30.139720 4957 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-9pp79-pull\" (UniqueName: \"kubernetes.io/secret/3f618edb-3184-469a-ac70-92ea11b9eae4-builder-dockercfg-9pp79-pull\") on node \"crc\" DevicePath \"\"" Jan 23 11:03:30 crc kubenswrapper[4957]: I0123 11:03:30.139732 4957 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-9pp79-push\" (UniqueName: \"kubernetes.io/secret/3f618edb-3184-469a-ac70-92ea11b9eae4-builder-dockercfg-9pp79-push\") on node \"crc\" DevicePath \"\"" Jan 23 11:03:30 crc kubenswrapper[4957]: I0123 11:03:30.139746 4957 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/3f618edb-3184-469a-ac70-92ea11b9eae4-container-storage-run\") on node \"crc\" DevicePath \"\"" Jan 23 11:03:30 crc kubenswrapper[4957]: I0123 11:03:30.139757 4957 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3f618edb-3184-469a-ac70-92ea11b9eae4-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 23 11:03:30 crc kubenswrapper[4957]: I0123 11:03:30.139769 4957 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w7tv\" (UniqueName: \"kubernetes.io/projected/3f618edb-3184-469a-ac70-92ea11b9eae4-kube-api-access-2w7tv\") on node \"crc\" DevicePath \"\"" Jan 23 11:03:30 crc kubenswrapper[4957]: I0123 11:03:30.580392 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-5-build_3f618edb-3184-469a-ac70-92ea11b9eae4/git-clone/0.log" Jan 23 11:03:30 crc kubenswrapper[4957]: I0123 11:03:30.580439 4957 generic.go:334] "Generic (PLEG): container finished" podID="3f618edb-3184-469a-ac70-92ea11b9eae4" containerID="89458461a7a29d04f7ffc591775b96f396f60df7e52c722c60b76c69250d5ff4" exitCode=1 Jan 23 11:03:30 crc kubenswrapper[4957]: I0123 11:03:30.580472 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-5-build" event={"ID":"3f618edb-3184-469a-ac70-92ea11b9eae4","Type":"ContainerDied","Data":"89458461a7a29d04f7ffc591775b96f396f60df7e52c722c60b76c69250d5ff4"} Jan 23 11:03:30 crc kubenswrapper[4957]: I0123 11:03:30.580501 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-5-build" event={"ID":"3f618edb-3184-469a-ac70-92ea11b9eae4","Type":"ContainerDied","Data":"5720d790b1696c04e1a9deda7f36f6c3a6b27cc8cf713c21b3370af14c68204d"} Jan 23 11:03:30 crc kubenswrapper[4957]: I0123 11:03:30.580521 4957 scope.go:117] "RemoveContainer" containerID="89458461a7a29d04f7ffc591775b96f396f60df7e52c722c60b76c69250d5ff4" Jan 23 11:03:30 crc kubenswrapper[4957]: I0123 11:03:30.580639 4957 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-5-build" Jan 23 11:03:30 crc kubenswrapper[4957]: I0123 11:03:30.602943 4957 scope.go:117] "RemoveContainer" containerID="89458461a7a29d04f7ffc591775b96f396f60df7e52c722c60b76c69250d5ff4" Jan 23 11:03:30 crc kubenswrapper[4957]: E0123 11:03:30.603433 4957 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89458461a7a29d04f7ffc591775b96f396f60df7e52c722c60b76c69250d5ff4\": container with ID starting with 89458461a7a29d04f7ffc591775b96f396f60df7e52c722c60b76c69250d5ff4 not found: ID does not exist" containerID="89458461a7a29d04f7ffc591775b96f396f60df7e52c722c60b76c69250d5ff4" Jan 23 11:03:30 crc kubenswrapper[4957]: I0123 11:03:30.603511 4957 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89458461a7a29d04f7ffc591775b96f396f60df7e52c722c60b76c69250d5ff4"} err="failed to get container status \"89458461a7a29d04f7ffc591775b96f396f60df7e52c722c60b76c69250d5ff4\": rpc error: code = NotFound desc = could not find container \"89458461a7a29d04f7ffc591775b96f396f60df7e52c722c60b76c69250d5ff4\": container with ID starting with 89458461a7a29d04f7ffc591775b96f396f60df7e52c722c60b76c69250d5ff4 not found: ID does not exist" Jan 23 11:03:30 crc kubenswrapper[4957]: I0123 11:03:30.621751 4957 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-operator-5-build"] Jan 23 11:03:30 crc kubenswrapper[4957]: I0123 11:03:30.630884 4957 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/service-telemetry-operator-5-build"] Jan 23 11:03:30 crc kubenswrapper[4957]: I0123 11:03:30.778220 4957 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f618edb-3184-469a-ac70-92ea11b9eae4" path="/var/lib/kubelet/pods/3f618edb-3184-469a-ac70-92ea11b9eae4/volumes" Jan 23 11:04:04 crc kubenswrapper[4957]: I0123 11:04:04.468948 4957 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-t27bc/must-gather-72xck"] Jan 23 11:04:04 crc kubenswrapper[4957]: E0123 11:04:04.469842 4957 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f618edb-3184-469a-ac70-92ea11b9eae4" containerName="git-clone" Jan 23 11:04:04 crc kubenswrapper[4957]: I0123 11:04:04.469856 4957 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f618edb-3184-469a-ac70-92ea11b9eae4" containerName="git-clone" Jan 23 11:04:04 crc kubenswrapper[4957]: I0123 11:04:04.469971 4957 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f618edb-3184-469a-ac70-92ea11b9eae4" containerName="git-clone" Jan 23 11:04:04 crc kubenswrapper[4957]: I0123 11:04:04.470739 4957 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-t27bc/must-gather-72xck" Jan 23 11:04:04 crc kubenswrapper[4957]: I0123 11:04:04.475427 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-t27bc"/"openshift-service-ca.crt" Jan 23 11:04:04 crc kubenswrapper[4957]: I0123 11:04:04.475522 4957 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-t27bc"/"kube-root-ca.crt" Jan 23 11:04:04 crc kubenswrapper[4957]: I0123 11:04:04.475743 4957 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-t27bc"/"default-dockercfg-zzrf2" Jan 23 11:04:04 crc kubenswrapper[4957]: I0123 11:04:04.485037 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-t27bc/must-gather-72xck"] Jan 23 11:04:04 crc kubenswrapper[4957]: I0123 11:04:04.599858 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xn2w7\" (UniqueName: \"kubernetes.io/projected/a22690bd-3600-4e3f-bb91-9194a1b0e425-kube-api-access-xn2w7\") pod \"must-gather-72xck\" (UID: \"a22690bd-3600-4e3f-bb91-9194a1b0e425\") " pod="openshift-must-gather-t27bc/must-gather-72xck" Jan 23 11:04:04 crc kubenswrapper[4957]: I0123 11:04:04.599927 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a22690bd-3600-4e3f-bb91-9194a1b0e425-must-gather-output\") pod \"must-gather-72xck\" (UID: \"a22690bd-3600-4e3f-bb91-9194a1b0e425\") " pod="openshift-must-gather-t27bc/must-gather-72xck" Jan 23 11:04:04 crc kubenswrapper[4957]: I0123 11:04:04.700695 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a22690bd-3600-4e3f-bb91-9194a1b0e425-must-gather-output\") pod \"must-gather-72xck\" (UID: \"a22690bd-3600-4e3f-bb91-9194a1b0e425\") " pod="openshift-must-gather-t27bc/must-gather-72xck" Jan 23 11:04:04 crc kubenswrapper[4957]: I0123 11:04:04.700817 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xn2w7\" (UniqueName: \"kubernetes.io/projected/a22690bd-3600-4e3f-bb91-9194a1b0e425-kube-api-access-xn2w7\") pod \"must-gather-72xck\" (UID: \"a22690bd-3600-4e3f-bb91-9194a1b0e425\") " pod="openshift-must-gather-t27bc/must-gather-72xck" Jan 23 11:04:04 crc kubenswrapper[4957]: I0123 11:04:04.701217 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a22690bd-3600-4e3f-bb91-9194a1b0e425-must-gather-output\") pod \"must-gather-72xck\" (UID: \"a22690bd-3600-4e3f-bb91-9194a1b0e425\") " pod="openshift-must-gather-t27bc/must-gather-72xck" Jan 23 11:04:04 crc kubenswrapper[4957]: I0123 11:04:04.716725 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xn2w7\" (UniqueName: \"kubernetes.io/projected/a22690bd-3600-4e3f-bb91-9194a1b0e425-kube-api-access-xn2w7\") pod \"must-gather-72xck\" (UID: \"a22690bd-3600-4e3f-bb91-9194a1b0e425\") " pod="openshift-must-gather-t27bc/must-gather-72xck" Jan 23 11:04:04 crc kubenswrapper[4957]: I0123 11:04:04.785246 4957 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-t27bc/must-gather-72xck" Jan 23 11:04:05 crc kubenswrapper[4957]: I0123 11:04:05.194013 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-t27bc/must-gather-72xck"] Jan 23 11:04:05 crc kubenswrapper[4957]: I0123 11:04:05.814601 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-t27bc/must-gather-72xck" event={"ID":"a22690bd-3600-4e3f-bb91-9194a1b0e425","Type":"ContainerStarted","Data":"63d8c22a07d2679b9c10d2874571cadaa2babe6a8651bf86b0609724b5873f68"} Jan 23 11:04:11 crc kubenswrapper[4957]: I0123 11:04:11.998978 4957 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Jan 23 11:04:12 crc kubenswrapper[4957]: I0123 11:04:12.862249 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-t27bc/must-gather-72xck" event={"ID":"a22690bd-3600-4e3f-bb91-9194a1b0e425","Type":"ContainerStarted","Data":"fc56c2cd2d0256128f7621070b2f84e9805674ec43ecb93d48bd5ecefb99116d"} Jan 23 11:04:12 crc kubenswrapper[4957]: I0123 11:04:12.862563 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-t27bc/must-gather-72xck" event={"ID":"a22690bd-3600-4e3f-bb91-9194a1b0e425","Type":"ContainerStarted","Data":"b5c2d7eeace9e2d708b6b671a9a4c5d2497fae1f7a6bdd294fb010a5137d1986"} Jan 23 11:04:12 crc kubenswrapper[4957]: I0123 11:04:12.877576 4957 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-t27bc/must-gather-72xck" podStartSLOduration=1.990607958 podStartE2EDuration="8.877560628s" podCreationTimestamp="2026-01-23 11:04:04 +0000 UTC" firstStartedPulling="2026-01-23 11:04:05.200421218 +0000 UTC m=+754.737673915" lastFinishedPulling="2026-01-23 11:04:12.087373898 +0000 UTC m=+761.624626585" observedRunningTime="2026-01-23 11:04:12.877337762 +0000 UTC m=+762.414590449" watchObservedRunningTime="2026-01-23 11:04:12.877560628 +0000 UTC m=+762.414813315" Jan 23 11:04:45 crc kubenswrapper[4957]: I0123 11:04:45.717692 4957 patch_prober.go:28] interesting pod/machine-config-daemon-w2xjv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 23 11:04:45 crc kubenswrapper[4957]: I0123 11:04:45.718151 4957 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w2xjv" podUID="224e3211-1f68-4673-8975-7e71b1e513d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 23 11:04:51 crc kubenswrapper[4957]: I0123 11:04:51.769790 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-xs8l7_04b86548-903a-4a0b-bb5f-c9cd297a9047/control-plane-machine-set-operator/0.log" Jan 23 11:04:51 crc kubenswrapper[4957]: I0123 11:04:51.904778 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-lrnc4_4238c53f-acdc-409f-8d1d-e4608fe5c239/kube-rbac-proxy/0.log" Jan 23 11:04:51 crc kubenswrapper[4957]: I0123 11:04:51.931653 4957 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-lrnc4_4238c53f-acdc-409f-8d1d-e4608fe5c239/machine-api-operator/0.log" Jan 23 11:05:03 crc kubenswrapper[4957]: I0123 11:05:03.498507 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-86cb77c54b-x5z94_308add36-df2e-4461-a8ee-3a3a430b738a/cert-manager-controller/0.log" Jan 23 11:05:03 crc kubenswrapper[4957]: I0123 11:05:03.659899 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-855d9ccff4-672j2_c5bcc35d-cb6f-4f1f-9fa2-9b75b0ff008f/cert-manager-cainjector/0.log" Jan 23 11:05:03 crc kubenswrapper[4957]: I0123 11:05:03.696658 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-f4fb5df64-ncfmh_4f0c5571-a692-4f49-8f15-074769b8c297/cert-manager-webhook/0.log" Jan 23 11:05:15 crc kubenswrapper[4957]: I0123 11:05:15.716946 4957 patch_prober.go:28] interesting pod/machine-config-daemon-w2xjv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 23 11:05:15 crc kubenswrapper[4957]: I0123 11:05:15.717535 4957 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w2xjv" podUID="224e3211-1f68-4673-8975-7e71b1e513d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 23 11:05:16 crc kubenswrapper[4957]: I0123 11:05:16.410693 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-dz78b_36135589-afeb-4693-8919-41232243a809/prometheus-operator/0.log" Jan 23 11:05:16 crc kubenswrapper[4957]: I0123 11:05:16.551916 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-8447f4d48-4kbjr_5efeae78-a9ef-4b16-b3b9-f022a1ed43eb/prometheus-operator-admission-webhook/0.log" Jan 23 11:05:16 crc kubenswrapper[4957]: I0123 11:05:16.584211 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-8447f4d48-nbj4q_f84a2c7c-d1fc-43f2-8e87-30e51f367c23/prometheus-operator-admission-webhook/0.log" Jan 23 11:05:16 crc kubenswrapper[4957]: I0123 11:05:16.706769 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-gl5gd_49486d1a-778e-4341-a8c7-a73763a29afc/operator/0.log" Jan 23 11:05:16 crc kubenswrapper[4957]: I0123 11:05:16.740625 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-75hhn_f2407120-3100-4d21-bc9c-2f305d00fc37/perses-operator/0.log" Jan 23 11:05:30 crc kubenswrapper[4957]: I0123 11:05:30.016231 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a7l6jd_05351f1f-ef4e-4c7a-a093-3cec8b6f3f56/util/0.log" Jan 23 11:05:30 crc kubenswrapper[4957]: I0123 11:05:30.215089 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a7l6jd_05351f1f-ef4e-4c7a-a093-3cec8b6f3f56/util/0.log" Jan 23 11:05:30 crc kubenswrapper[4957]: I0123 11:05:30.241965 4957 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a7l6jd_05351f1f-ef4e-4c7a-a093-3cec8b6f3f56/pull/0.log" Jan 23 11:05:30 crc kubenswrapper[4957]: I0123 11:05:30.246542 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a7l6jd_05351f1f-ef4e-4c7a-a093-3cec8b6f3f56/pull/0.log" Jan 23 11:05:30 crc kubenswrapper[4957]: I0123 11:05:30.388428 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a7l6jd_05351f1f-ef4e-4c7a-a093-3cec8b6f3f56/util/0.log" Jan 23 11:05:30 crc kubenswrapper[4957]: I0123 11:05:30.427150 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a7l6jd_05351f1f-ef4e-4c7a-a093-3cec8b6f3f56/extract/0.log" Jan 23 11:05:30 crc kubenswrapper[4957]: I0123 11:05:30.473089 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a7l6jd_05351f1f-ef4e-4c7a-a093-3cec8b6f3f56/pull/0.log" Jan 23 11:05:30 crc kubenswrapper[4957]: I0123 11:05:30.574159 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fvhl79_84a261d5-d82c-4cbe-9079-64c9380bcd44/util/0.log" Jan 23 11:05:30 crc kubenswrapper[4957]: I0123 11:05:30.732985 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fvhl79_84a261d5-d82c-4cbe-9079-64c9380bcd44/util/0.log" Jan 23 11:05:30 crc kubenswrapper[4957]: I0123 11:05:30.733594 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fvhl79_84a261d5-d82c-4cbe-9079-64c9380bcd44/pull/0.log" Jan 23 11:05:30 crc kubenswrapper[4957]: I0123 11:05:30.739086 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fvhl79_84a261d5-d82c-4cbe-9079-64c9380bcd44/pull/0.log" Jan 23 11:05:30 crc kubenswrapper[4957]: I0123 11:05:30.898319 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fvhl79_84a261d5-d82c-4cbe-9079-64c9380bcd44/util/0.log" Jan 23 11:05:30 crc kubenswrapper[4957]: I0123 11:05:30.927671 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fvhl79_84a261d5-d82c-4cbe-9079-64c9380bcd44/pull/0.log" Jan 23 11:05:30 crc kubenswrapper[4957]: I0123 11:05:30.954820 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fvhl79_84a261d5-d82c-4cbe-9079-64c9380bcd44/extract/0.log" Jan 23 11:05:31 crc kubenswrapper[4957]: I0123 11:05:31.088028 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5emt7dq_bfa105be-eb9e-4e68-8aa1-5d91e002ca3c/util/0.log" Jan 23 11:05:31 crc kubenswrapper[4957]: I0123 11:05:31.237533 4957 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5emt7dq_bfa105be-eb9e-4e68-8aa1-5d91e002ca3c/util/0.log" Jan 23 11:05:31 crc kubenswrapper[4957]: I0123 11:05:31.248886 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5emt7dq_bfa105be-eb9e-4e68-8aa1-5d91e002ca3c/pull/0.log" Jan 23 11:05:31 crc kubenswrapper[4957]: I0123 11:05:31.283461 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5emt7dq_bfa105be-eb9e-4e68-8aa1-5d91e002ca3c/pull/0.log" Jan 23 11:05:31 crc kubenswrapper[4957]: I0123 11:05:31.440813 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5emt7dq_bfa105be-eb9e-4e68-8aa1-5d91e002ca3c/util/0.log" Jan 23 11:05:31 crc kubenswrapper[4957]: I0123 11:05:31.475393 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5emt7dq_bfa105be-eb9e-4e68-8aa1-5d91e002ca3c/extract/0.log" Jan 23 11:05:31 crc kubenswrapper[4957]: I0123 11:05:31.497187 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5emt7dq_bfa105be-eb9e-4e68-8aa1-5d91e002ca3c/pull/0.log" Jan 23 11:05:31 crc kubenswrapper[4957]: I0123 11:05:31.645369 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08cz5f7_684bbccf-3dbd-475e-b034-dfb861ac18a0/util/0.log" Jan 23 11:05:31 crc kubenswrapper[4957]: I0123 11:05:31.762903 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08cz5f7_684bbccf-3dbd-475e-b034-dfb861ac18a0/util/0.log" Jan 23 11:05:31 crc kubenswrapper[4957]: I0123 11:05:31.802100 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08cz5f7_684bbccf-3dbd-475e-b034-dfb861ac18a0/pull/0.log" Jan 23 11:05:31 crc kubenswrapper[4957]: I0123 11:05:31.849393 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08cz5f7_684bbccf-3dbd-475e-b034-dfb861ac18a0/pull/0.log" Jan 23 11:05:32 crc kubenswrapper[4957]: I0123 11:05:32.119388 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08cz5f7_684bbccf-3dbd-475e-b034-dfb861ac18a0/util/0.log" Jan 23 11:05:32 crc kubenswrapper[4957]: I0123 11:05:32.142800 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08cz5f7_684bbccf-3dbd-475e-b034-dfb861ac18a0/extract/0.log" Jan 23 11:05:32 crc kubenswrapper[4957]: I0123 11:05:32.153619 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08cz5f7_684bbccf-3dbd-475e-b034-dfb861ac18a0/pull/0.log" Jan 23 11:05:32 crc kubenswrapper[4957]: I0123 11:05:32.288733 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bsgcz_cd776179-759c-4fd9-987a-445a1d516d4c/extract-utilities/0.log" Jan 23 
11:05:32 crc kubenswrapper[4957]: I0123 11:05:32.435375 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bsgcz_cd776179-759c-4fd9-987a-445a1d516d4c/extract-content/0.log" Jan 23 11:05:32 crc kubenswrapper[4957]: I0123 11:05:32.445580 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bsgcz_cd776179-759c-4fd9-987a-445a1d516d4c/extract-utilities/0.log" Jan 23 11:05:32 crc kubenswrapper[4957]: I0123 11:05:32.479989 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bsgcz_cd776179-759c-4fd9-987a-445a1d516d4c/extract-content/0.log" Jan 23 11:05:32 crc kubenswrapper[4957]: I0123 11:05:32.599830 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bsgcz_cd776179-759c-4fd9-987a-445a1d516d4c/extract-utilities/0.log" Jan 23 11:05:32 crc kubenswrapper[4957]: I0123 11:05:32.621219 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bsgcz_cd776179-759c-4fd9-987a-445a1d516d4c/extract-content/0.log" Jan 23 11:05:32 crc kubenswrapper[4957]: I0123 11:05:32.777660 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bsgcz_cd776179-759c-4fd9-987a-445a1d516d4c/registry-server/0.log" Jan 23 11:05:32 crc kubenswrapper[4957]: I0123 11:05:32.838908 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-2p4vh_d9dc559e-59e7-459c-9a4a-fb361bffad34/extract-utilities/0.log" Jan 23 11:05:32 crc kubenswrapper[4957]: I0123 11:05:32.977801 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-2p4vh_d9dc559e-59e7-459c-9a4a-fb361bffad34/extract-utilities/0.log" Jan 23 11:05:32 crc kubenswrapper[4957]: I0123 11:05:32.996002 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-2p4vh_d9dc559e-59e7-459c-9a4a-fb361bffad34/extract-content/0.log" Jan 23 11:05:32 crc kubenswrapper[4957]: I0123 11:05:32.997958 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-2p4vh_d9dc559e-59e7-459c-9a4a-fb361bffad34/extract-content/0.log" Jan 23 11:05:33 crc kubenswrapper[4957]: I0123 11:05:33.133788 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-2p4vh_d9dc559e-59e7-459c-9a4a-fb361bffad34/extract-content/0.log" Jan 23 11:05:33 crc kubenswrapper[4957]: I0123 11:05:33.139972 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-2p4vh_d9dc559e-59e7-459c-9a4a-fb361bffad34/extract-utilities/0.log" Jan 23 11:05:33 crc kubenswrapper[4957]: I0123 11:05:33.326579 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-pq4t4_564e2b7f-db59-4c5e-bb9d-eaefadf2e1a8/marketplace-operator/0.log" Jan 23 11:05:33 crc kubenswrapper[4957]: I0123 11:05:33.330050 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-pq4t4_564e2b7f-db59-4c5e-bb9d-eaefadf2e1a8/marketplace-operator/1.log" Jan 23 11:05:33 crc kubenswrapper[4957]: I0123 11:05:33.376936 4957 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-2p4vh_d9dc559e-59e7-459c-9a4a-fb361bffad34/registry-server/0.log" Jan 23 11:05:33 crc kubenswrapper[4957]: I0123 11:05:33.492528 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xhhxg_9072b86d-252b-4804-ab47-be737d2a88ee/extract-utilities/0.log" Jan 23 11:05:33 crc kubenswrapper[4957]: I0123 11:05:33.634864 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xhhxg_9072b86d-252b-4804-ab47-be737d2a88ee/extract-content/0.log" Jan 23 11:05:33 crc kubenswrapper[4957]: I0123 11:05:33.635320 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xhhxg_9072b86d-252b-4804-ab47-be737d2a88ee/extract-content/0.log" Jan 23 11:05:33 crc kubenswrapper[4957]: I0123 11:05:33.660386 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xhhxg_9072b86d-252b-4804-ab47-be737d2a88ee/extract-utilities/0.log" Jan 23 11:05:33 crc kubenswrapper[4957]: I0123 11:05:33.805038 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xhhxg_9072b86d-252b-4804-ab47-be737d2a88ee/extract-utilities/0.log" Jan 23 11:05:33 crc kubenswrapper[4957]: I0123 11:05:33.809466 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xhhxg_9072b86d-252b-4804-ab47-be737d2a88ee/extract-content/0.log" Jan 23 11:05:33 crc kubenswrapper[4957]: I0123 11:05:33.963339 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xhhxg_9072b86d-252b-4804-ab47-be737d2a88ee/registry-server/0.log" Jan 23 11:05:44 crc kubenswrapper[4957]: I0123 11:05:44.434070 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-dz78b_36135589-afeb-4693-8919-41232243a809/prometheus-operator/0.log" Jan 23 11:05:44 crc kubenswrapper[4957]: I0123 11:05:44.457021 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-8447f4d48-nbj4q_f84a2c7c-d1fc-43f2-8e87-30e51f367c23/prometheus-operator-admission-webhook/0.log" Jan 23 11:05:44 crc kubenswrapper[4957]: I0123 11:05:44.496315 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-8447f4d48-4kbjr_5efeae78-a9ef-4b16-b3b9-f022a1ed43eb/prometheus-operator-admission-webhook/0.log" Jan 23 11:05:44 crc kubenswrapper[4957]: I0123 11:05:44.592553 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-gl5gd_49486d1a-778e-4341-a8c7-a73763a29afc/operator/0.log" Jan 23 11:05:44 crc kubenswrapper[4957]: I0123 11:05:44.621617 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-75hhn_f2407120-3100-4d21-bc9c-2f305d00fc37/perses-operator/0.log" Jan 23 11:05:45 crc kubenswrapper[4957]: I0123 11:05:45.716907 4957 patch_prober.go:28] interesting pod/machine-config-daemon-w2xjv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 23 11:05:45 crc kubenswrapper[4957]: I0123 11:05:45.716962 4957 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-w2xjv" podUID="224e3211-1f68-4673-8975-7e71b1e513d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 23 11:05:45 crc kubenswrapper[4957]: I0123 11:05:45.717012 4957 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w2xjv" Jan 23 11:05:45 crc kubenswrapper[4957]: I0123 11:05:45.717581 4957 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"16ac91441901705df207b992dbdbed5b8c06671e49d0ee3176372ea44a7ecdf1"} pod="openshift-machine-config-operator/machine-config-daemon-w2xjv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 23 11:05:45 crc kubenswrapper[4957]: I0123 11:05:45.717647 4957 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w2xjv" podUID="224e3211-1f68-4673-8975-7e71b1e513d0" containerName="machine-config-daemon" containerID="cri-o://16ac91441901705df207b992dbdbed5b8c06671e49d0ee3176372ea44a7ecdf1" gracePeriod=600 Jan 23 11:05:46 crc kubenswrapper[4957]: I0123 11:05:46.457493 4957 generic.go:334] "Generic (PLEG): container finished" podID="224e3211-1f68-4673-8975-7e71b1e513d0" containerID="16ac91441901705df207b992dbdbed5b8c06671e49d0ee3176372ea44a7ecdf1" exitCode=0 Jan 23 11:05:46 crc kubenswrapper[4957]: I0123 11:05:46.457585 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2xjv" event={"ID":"224e3211-1f68-4673-8975-7e71b1e513d0","Type":"ContainerDied","Data":"16ac91441901705df207b992dbdbed5b8c06671e49d0ee3176372ea44a7ecdf1"} Jan 23 11:05:46 crc kubenswrapper[4957]: I0123 11:05:46.458048 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2xjv" event={"ID":"224e3211-1f68-4673-8975-7e71b1e513d0","Type":"ContainerStarted","Data":"fbb8a9b1301ce5b3887b4c976fd329b3eaa92abd5c98e3db947c53dedb1ed1cd"} Jan 23 11:05:46 crc kubenswrapper[4957]: I0123 11:05:46.458074 4957 scope.go:117] "RemoveContainer" containerID="303ca83532cca99052083264393534af7ab68b89a646ac04970e5eb8fdb50844" Jan 23 11:06:35 crc kubenswrapper[4957]: I0123 11:06:35.764807 4957 generic.go:334] "Generic (PLEG): container finished" podID="a22690bd-3600-4e3f-bb91-9194a1b0e425" containerID="b5c2d7eeace9e2d708b6b671a9a4c5d2497fae1f7a6bdd294fb010a5137d1986" exitCode=0 Jan 23 11:06:35 crc kubenswrapper[4957]: I0123 11:06:35.764910 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-t27bc/must-gather-72xck" event={"ID":"a22690bd-3600-4e3f-bb91-9194a1b0e425","Type":"ContainerDied","Data":"b5c2d7eeace9e2d708b6b671a9a4c5d2497fae1f7a6bdd294fb010a5137d1986"} Jan 23 11:06:35 crc kubenswrapper[4957]: I0123 11:06:35.765740 4957 scope.go:117] "RemoveContainer" containerID="b5c2d7eeace9e2d708b6b671a9a4c5d2497fae1f7a6bdd294fb010a5137d1986" Jan 23 11:06:36 crc kubenswrapper[4957]: I0123 11:06:36.710714 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-t27bc_must-gather-72xck_a22690bd-3600-4e3f-bb91-9194a1b0e425/gather/0.log" Jan 23 11:06:43 crc kubenswrapper[4957]: I0123 11:06:43.691064 4957 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-must-gather-t27bc/must-gather-72xck"] Jan 23 11:06:43 crc kubenswrapper[4957]: I0123 11:06:43.692892 4957 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-t27bc/must-gather-72xck" podUID="a22690bd-3600-4e3f-bb91-9194a1b0e425" containerName="copy" containerID="cri-o://fc56c2cd2d0256128f7621070b2f84e9805674ec43ecb93d48bd5ecefb99116d" gracePeriod=2 Jan 23 11:06:43 crc kubenswrapper[4957]: I0123 11:06:43.696367 4957 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-t27bc/must-gather-72xck"] Jan 23 11:06:44 crc kubenswrapper[4957]: I0123 11:06:44.772583 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-t27bc_must-gather-72xck_a22690bd-3600-4e3f-bb91-9194a1b0e425/copy/0.log" Jan 23 11:06:44 crc kubenswrapper[4957]: I0123 11:06:44.773887 4957 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-t27bc/must-gather-72xck" Jan 23 11:06:44 crc kubenswrapper[4957]: I0123 11:06:44.826093 4957 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-t27bc_must-gather-72xck_a22690bd-3600-4e3f-bb91-9194a1b0e425/copy/0.log" Jan 23 11:06:44 crc kubenswrapper[4957]: I0123 11:06:44.826603 4957 generic.go:334] "Generic (PLEG): container finished" podID="a22690bd-3600-4e3f-bb91-9194a1b0e425" containerID="fc56c2cd2d0256128f7621070b2f84e9805674ec43ecb93d48bd5ecefb99116d" exitCode=143 Jan 23 11:06:44 crc kubenswrapper[4957]: I0123 11:06:44.826662 4957 scope.go:117] "RemoveContainer" containerID="fc56c2cd2d0256128f7621070b2f84e9805674ec43ecb93d48bd5ecefb99116d" Jan 23 11:06:44 crc kubenswrapper[4957]: I0123 11:06:44.826673 4957 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-t27bc/must-gather-72xck" Jan 23 11:06:44 crc kubenswrapper[4957]: I0123 11:06:44.843845 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xn2w7\" (UniqueName: \"kubernetes.io/projected/a22690bd-3600-4e3f-bb91-9194a1b0e425-kube-api-access-xn2w7\") pod \"a22690bd-3600-4e3f-bb91-9194a1b0e425\" (UID: \"a22690bd-3600-4e3f-bb91-9194a1b0e425\") " Jan 23 11:06:44 crc kubenswrapper[4957]: I0123 11:06:44.844108 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a22690bd-3600-4e3f-bb91-9194a1b0e425-must-gather-output\") pod \"a22690bd-3600-4e3f-bb91-9194a1b0e425\" (UID: \"a22690bd-3600-4e3f-bb91-9194a1b0e425\") " Jan 23 11:06:44 crc kubenswrapper[4957]: I0123 11:06:44.849326 4957 scope.go:117] "RemoveContainer" containerID="b5c2d7eeace9e2d708b6b671a9a4c5d2497fae1f7a6bdd294fb010a5137d1986" Jan 23 11:06:44 crc kubenswrapper[4957]: I0123 11:06:44.850305 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a22690bd-3600-4e3f-bb91-9194a1b0e425-kube-api-access-xn2w7" (OuterVolumeSpecName: "kube-api-access-xn2w7") pod "a22690bd-3600-4e3f-bb91-9194a1b0e425" (UID: "a22690bd-3600-4e3f-bb91-9194a1b0e425"). InnerVolumeSpecName "kube-api-access-xn2w7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 23 11:06:44 crc kubenswrapper[4957]: I0123 11:06:44.899116 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a22690bd-3600-4e3f-bb91-9194a1b0e425-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "a22690bd-3600-4e3f-bb91-9194a1b0e425" (UID: "a22690bd-3600-4e3f-bb91-9194a1b0e425"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 23 11:06:44 crc kubenswrapper[4957]: I0123 11:06:44.904110 4957 scope.go:117] "RemoveContainer" containerID="fc56c2cd2d0256128f7621070b2f84e9805674ec43ecb93d48bd5ecefb99116d" Jan 23 11:06:44 crc kubenswrapper[4957]: E0123 11:06:44.905191 4957 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc56c2cd2d0256128f7621070b2f84e9805674ec43ecb93d48bd5ecefb99116d\": container with ID starting with fc56c2cd2d0256128f7621070b2f84e9805674ec43ecb93d48bd5ecefb99116d not found: ID does not exist" containerID="fc56c2cd2d0256128f7621070b2f84e9805674ec43ecb93d48bd5ecefb99116d" Jan 23 11:06:44 crc kubenswrapper[4957]: I0123 11:06:44.905227 4957 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc56c2cd2d0256128f7621070b2f84e9805674ec43ecb93d48bd5ecefb99116d"} err="failed to get container status \"fc56c2cd2d0256128f7621070b2f84e9805674ec43ecb93d48bd5ecefb99116d\": rpc error: code = NotFound desc = could not find container \"fc56c2cd2d0256128f7621070b2f84e9805674ec43ecb93d48bd5ecefb99116d\": container with ID starting with fc56c2cd2d0256128f7621070b2f84e9805674ec43ecb93d48bd5ecefb99116d not found: ID does not exist" Jan 23 11:06:44 crc kubenswrapper[4957]: I0123 11:06:44.905255 4957 scope.go:117] "RemoveContainer" containerID="b5c2d7eeace9e2d708b6b671a9a4c5d2497fae1f7a6bdd294fb010a5137d1986" Jan 23 11:06:44 crc kubenswrapper[4957]: E0123 11:06:44.905699 4957 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5c2d7eeace9e2d708b6b671a9a4c5d2497fae1f7a6bdd294fb010a5137d1986\": container with ID starting with b5c2d7eeace9e2d708b6b671a9a4c5d2497fae1f7a6bdd294fb010a5137d1986 not found: ID does not exist" containerID="b5c2d7eeace9e2d708b6b671a9a4c5d2497fae1f7a6bdd294fb010a5137d1986" Jan 23 11:06:44 crc kubenswrapper[4957]: I0123 11:06:44.905720 4957 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5c2d7eeace9e2d708b6b671a9a4c5d2497fae1f7a6bdd294fb010a5137d1986"} err="failed to get container status \"b5c2d7eeace9e2d708b6b671a9a4c5d2497fae1f7a6bdd294fb010a5137d1986\": rpc error: code = NotFound desc = could not find container \"b5c2d7eeace9e2d708b6b671a9a4c5d2497fae1f7a6bdd294fb010a5137d1986\": container with ID starting with b5c2d7eeace9e2d708b6b671a9a4c5d2497fae1f7a6bdd294fb010a5137d1986 not found: ID does not exist" Jan 23 11:06:44 crc kubenswrapper[4957]: I0123 11:06:44.945740 4957 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a22690bd-3600-4e3f-bb91-9194a1b0e425-must-gather-output\") on node \"crc\" DevicePath \"\"" Jan 23 11:06:44 crc kubenswrapper[4957]: I0123 11:06:44.945775 4957 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xn2w7\" (UniqueName: \"kubernetes.io/projected/a22690bd-3600-4e3f-bb91-9194a1b0e425-kube-api-access-xn2w7\") on node \"crc\" DevicePath \"\"" Jan 23 
11:06:46 crc kubenswrapper[4957]: I0123 11:06:46.776339 4957 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a22690bd-3600-4e3f-bb91-9194a1b0e425" path="/var/lib/kubelet/pods/a22690bd-3600-4e3f-bb91-9194a1b0e425/volumes" Jan 23 11:07:33 crc kubenswrapper[4957]: I0123 11:07:33.344506 4957 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-c5rwr"] Jan 23 11:07:33 crc kubenswrapper[4957]: E0123 11:07:33.345330 4957 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a22690bd-3600-4e3f-bb91-9194a1b0e425" containerName="gather" Jan 23 11:07:33 crc kubenswrapper[4957]: I0123 11:07:33.345345 4957 state_mem.go:107] "Deleted CPUSet assignment" podUID="a22690bd-3600-4e3f-bb91-9194a1b0e425" containerName="gather" Jan 23 11:07:33 crc kubenswrapper[4957]: E0123 11:07:33.345359 4957 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a22690bd-3600-4e3f-bb91-9194a1b0e425" containerName="copy" Jan 23 11:07:33 crc kubenswrapper[4957]: I0123 11:07:33.345366 4957 state_mem.go:107] "Deleted CPUSet assignment" podUID="a22690bd-3600-4e3f-bb91-9194a1b0e425" containerName="copy" Jan 23 11:07:33 crc kubenswrapper[4957]: I0123 11:07:33.345471 4957 memory_manager.go:354] "RemoveStaleState removing state" podUID="a22690bd-3600-4e3f-bb91-9194a1b0e425" containerName="copy" Jan 23 11:07:33 crc kubenswrapper[4957]: I0123 11:07:33.345490 4957 memory_manager.go:354] "RemoveStaleState removing state" podUID="a22690bd-3600-4e3f-bb91-9194a1b0e425" containerName="gather" Jan 23 11:07:33 crc kubenswrapper[4957]: I0123 11:07:33.346406 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-c5rwr" Jan 23 11:07:33 crc kubenswrapper[4957]: I0123 11:07:33.358617 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-c5rwr"] Jan 23 11:07:33 crc kubenswrapper[4957]: I0123 11:07:33.502229 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qww6r\" (UniqueName: \"kubernetes.io/projected/9f845eb7-4020-418c-976f-710e1d8e7764-kube-api-access-qww6r\") pod \"community-operators-c5rwr\" (UID: \"9f845eb7-4020-418c-976f-710e1d8e7764\") " pod="openshift-marketplace/community-operators-c5rwr" Jan 23 11:07:33 crc kubenswrapper[4957]: I0123 11:07:33.502577 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f845eb7-4020-418c-976f-710e1d8e7764-utilities\") pod \"community-operators-c5rwr\" (UID: \"9f845eb7-4020-418c-976f-710e1d8e7764\") " pod="openshift-marketplace/community-operators-c5rwr" Jan 23 11:07:33 crc kubenswrapper[4957]: I0123 11:07:33.502704 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f845eb7-4020-418c-976f-710e1d8e7764-catalog-content\") pod \"community-operators-c5rwr\" (UID: \"9f845eb7-4020-418c-976f-710e1d8e7764\") " pod="openshift-marketplace/community-operators-c5rwr" Jan 23 11:07:33 crc kubenswrapper[4957]: I0123 11:07:33.604056 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f845eb7-4020-418c-976f-710e1d8e7764-utilities\") pod \"community-operators-c5rwr\" (UID: \"9f845eb7-4020-418c-976f-710e1d8e7764\") " pod="openshift-marketplace/community-operators-c5rwr" 
Jan 23 11:07:33 crc kubenswrapper[4957]: I0123 11:07:33.604132 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f845eb7-4020-418c-976f-710e1d8e7764-catalog-content\") pod \"community-operators-c5rwr\" (UID: \"9f845eb7-4020-418c-976f-710e1d8e7764\") " pod="openshift-marketplace/community-operators-c5rwr" Jan 23 11:07:33 crc kubenswrapper[4957]: I0123 11:07:33.604170 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qww6r\" (UniqueName: \"kubernetes.io/projected/9f845eb7-4020-418c-976f-710e1d8e7764-kube-api-access-qww6r\") pod \"community-operators-c5rwr\" (UID: \"9f845eb7-4020-418c-976f-710e1d8e7764\") " pod="openshift-marketplace/community-operators-c5rwr" Jan 23 11:07:33 crc kubenswrapper[4957]: I0123 11:07:33.604599 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f845eb7-4020-418c-976f-710e1d8e7764-utilities\") pod \"community-operators-c5rwr\" (UID: \"9f845eb7-4020-418c-976f-710e1d8e7764\") " pod="openshift-marketplace/community-operators-c5rwr" Jan 23 11:07:33 crc kubenswrapper[4957]: I0123 11:07:33.604901 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f845eb7-4020-418c-976f-710e1d8e7764-catalog-content\") pod \"community-operators-c5rwr\" (UID: \"9f845eb7-4020-418c-976f-710e1d8e7764\") " pod="openshift-marketplace/community-operators-c5rwr" Jan 23 11:07:33 crc kubenswrapper[4957]: I0123 11:07:33.623318 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qww6r\" (UniqueName: \"kubernetes.io/projected/9f845eb7-4020-418c-976f-710e1d8e7764-kube-api-access-qww6r\") pod \"community-operators-c5rwr\" (UID: \"9f845eb7-4020-418c-976f-710e1d8e7764\") " pod="openshift-marketplace/community-operators-c5rwr" Jan 23 11:07:33 crc kubenswrapper[4957]: I0123 11:07:33.702692 4957 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-c5rwr" Jan 23 11:07:33 crc kubenswrapper[4957]: I0123 11:07:33.938668 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-c5rwr"] Jan 23 11:07:34 crc kubenswrapper[4957]: I0123 11:07:34.146347 4957 generic.go:334] "Generic (PLEG): container finished" podID="9f845eb7-4020-418c-976f-710e1d8e7764" containerID="121d9b095d9b80d0f6f2626b6140381a20fe1167ae326a59b11eeb99bc3de0d6" exitCode=0 Jan 23 11:07:34 crc kubenswrapper[4957]: I0123 11:07:34.146386 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c5rwr" event={"ID":"9f845eb7-4020-418c-976f-710e1d8e7764","Type":"ContainerDied","Data":"121d9b095d9b80d0f6f2626b6140381a20fe1167ae326a59b11eeb99bc3de0d6"} Jan 23 11:07:34 crc kubenswrapper[4957]: I0123 11:07:34.146408 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c5rwr" event={"ID":"9f845eb7-4020-418c-976f-710e1d8e7764","Type":"ContainerStarted","Data":"d7d9ed8d7a9cb1b681ac4a564eb1cece1873d704effa1b4e5076aa961a3b670f"} Jan 23 11:07:34 crc kubenswrapper[4957]: I0123 11:07:34.147974 4957 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 23 11:07:35 crc kubenswrapper[4957]: I0123 11:07:35.153021 4957 generic.go:334] "Generic (PLEG): container finished" podID="9f845eb7-4020-418c-976f-710e1d8e7764" containerID="28aad0918ad476730cf105f51a7bf2440368694a0617b701759c8400520a8bac" exitCode=0 Jan 23 11:07:35 crc kubenswrapper[4957]: I0123 11:07:35.153081 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c5rwr" event={"ID":"9f845eb7-4020-418c-976f-710e1d8e7764","Type":"ContainerDied","Data":"28aad0918ad476730cf105f51a7bf2440368694a0617b701759c8400520a8bac"} Jan 23 11:07:36 crc kubenswrapper[4957]: I0123 11:07:36.161195 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c5rwr" event={"ID":"9f845eb7-4020-418c-976f-710e1d8e7764","Type":"ContainerStarted","Data":"546f939f3aea5dc9b2f1e4ee98d71b7cc1ba3ced8bee2924fdd429c98af624c2"} Jan 23 11:07:36 crc kubenswrapper[4957]: I0123 11:07:36.186743 4957 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-c5rwr" podStartSLOduration=1.708931744 podStartE2EDuration="3.186724333s" podCreationTimestamp="2026-01-23 11:07:33 +0000 UTC" firstStartedPulling="2026-01-23 11:07:34.14768691 +0000 UTC m=+963.684939617" lastFinishedPulling="2026-01-23 11:07:35.625479529 +0000 UTC m=+965.162732206" observedRunningTime="2026-01-23 11:07:36.180877751 +0000 UTC m=+965.718130448" watchObservedRunningTime="2026-01-23 11:07:36.186724333 +0000 UTC m=+965.723977040" Jan 23 11:07:39 crc kubenswrapper[4957]: I0123 11:07:39.103539 4957 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-sl2hr"] Jan 23 11:07:39 crc kubenswrapper[4957]: I0123 11:07:39.109376 4957 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-sl2hr" Jan 23 11:07:39 crc kubenswrapper[4957]: I0123 11:07:39.130234 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sl2hr"] Jan 23 11:07:39 crc kubenswrapper[4957]: I0123 11:07:39.271855 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcnz2\" (UniqueName: \"kubernetes.io/projected/e3c5b3fa-4657-45db-ad53-8a48be03edfa-kube-api-access-hcnz2\") pod \"redhat-operators-sl2hr\" (UID: \"e3c5b3fa-4657-45db-ad53-8a48be03edfa\") " pod="openshift-marketplace/redhat-operators-sl2hr" Jan 23 11:07:39 crc kubenswrapper[4957]: I0123 11:07:39.272478 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3c5b3fa-4657-45db-ad53-8a48be03edfa-utilities\") pod \"redhat-operators-sl2hr\" (UID: \"e3c5b3fa-4657-45db-ad53-8a48be03edfa\") " pod="openshift-marketplace/redhat-operators-sl2hr" Jan 23 11:07:39 crc kubenswrapper[4957]: I0123 11:07:39.272611 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3c5b3fa-4657-45db-ad53-8a48be03edfa-catalog-content\") pod \"redhat-operators-sl2hr\" (UID: \"e3c5b3fa-4657-45db-ad53-8a48be03edfa\") " pod="openshift-marketplace/redhat-operators-sl2hr" Jan 23 11:07:39 crc kubenswrapper[4957]: I0123 11:07:39.374560 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hcnz2\" (UniqueName: \"kubernetes.io/projected/e3c5b3fa-4657-45db-ad53-8a48be03edfa-kube-api-access-hcnz2\") pod \"redhat-operators-sl2hr\" (UID: \"e3c5b3fa-4657-45db-ad53-8a48be03edfa\") " pod="openshift-marketplace/redhat-operators-sl2hr" Jan 23 11:07:39 crc kubenswrapper[4957]: I0123 11:07:39.374610 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3c5b3fa-4657-45db-ad53-8a48be03edfa-utilities\") pod \"redhat-operators-sl2hr\" (UID: \"e3c5b3fa-4657-45db-ad53-8a48be03edfa\") " pod="openshift-marketplace/redhat-operators-sl2hr" Jan 23 11:07:39 crc kubenswrapper[4957]: I0123 11:07:39.374631 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3c5b3fa-4657-45db-ad53-8a48be03edfa-catalog-content\") pod \"redhat-operators-sl2hr\" (UID: \"e3c5b3fa-4657-45db-ad53-8a48be03edfa\") " pod="openshift-marketplace/redhat-operators-sl2hr" Jan 23 11:07:39 crc kubenswrapper[4957]: I0123 11:07:39.375809 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3c5b3fa-4657-45db-ad53-8a48be03edfa-catalog-content\") pod \"redhat-operators-sl2hr\" (UID: \"e3c5b3fa-4657-45db-ad53-8a48be03edfa\") " pod="openshift-marketplace/redhat-operators-sl2hr" Jan 23 11:07:39 crc kubenswrapper[4957]: I0123 11:07:39.376099 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3c5b3fa-4657-45db-ad53-8a48be03edfa-utilities\") pod \"redhat-operators-sl2hr\" (UID: \"e3c5b3fa-4657-45db-ad53-8a48be03edfa\") " pod="openshift-marketplace/redhat-operators-sl2hr" Jan 23 11:07:39 crc kubenswrapper[4957]: I0123 11:07:39.397413 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-hcnz2\" (UniqueName: \"kubernetes.io/projected/e3c5b3fa-4657-45db-ad53-8a48be03edfa-kube-api-access-hcnz2\") pod \"redhat-operators-sl2hr\" (UID: \"e3c5b3fa-4657-45db-ad53-8a48be03edfa\") " pod="openshift-marketplace/redhat-operators-sl2hr" Jan 23 11:07:39 crc kubenswrapper[4957]: I0123 11:07:39.435802 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sl2hr" Jan 23 11:07:39 crc kubenswrapper[4957]: I0123 11:07:39.831990 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sl2hr"] Jan 23 11:07:39 crc kubenswrapper[4957]: W0123 11:07:39.844458 4957 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3c5b3fa_4657_45db_ad53_8a48be03edfa.slice/crio-0e5f3290f7f0034dd066087fb778531e597a019f0221fa4a915d741cc3fe466f WatchSource:0}: Error finding container 0e5f3290f7f0034dd066087fb778531e597a019f0221fa4a915d741cc3fe466f: Status 404 returned error can't find the container with id 0e5f3290f7f0034dd066087fb778531e597a019f0221fa4a915d741cc3fe466f Jan 23 11:07:40 crc kubenswrapper[4957]: I0123 11:07:40.186210 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sl2hr" event={"ID":"e3c5b3fa-4657-45db-ad53-8a48be03edfa","Type":"ContainerStarted","Data":"0e5f3290f7f0034dd066087fb778531e597a019f0221fa4a915d741cc3fe466f"} Jan 23 11:07:42 crc kubenswrapper[4957]: I0123 11:07:42.206226 4957 generic.go:334] "Generic (PLEG): container finished" podID="e3c5b3fa-4657-45db-ad53-8a48be03edfa" containerID="405a0e7ea4810fb11fbffef72d629a606ea2f1eefb24c38bfa732f7aedb06b8f" exitCode=0 Jan 23 11:07:42 crc kubenswrapper[4957]: I0123 11:07:42.206320 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sl2hr" event={"ID":"e3c5b3fa-4657-45db-ad53-8a48be03edfa","Type":"ContainerDied","Data":"405a0e7ea4810fb11fbffef72d629a606ea2f1eefb24c38bfa732f7aedb06b8f"} Jan 23 11:07:43 crc kubenswrapper[4957]: I0123 11:07:43.703225 4957 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-c5rwr" Jan 23 11:07:43 crc kubenswrapper[4957]: I0123 11:07:43.703765 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-c5rwr" Jan 23 11:07:43 crc kubenswrapper[4957]: I0123 11:07:43.747860 4957 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-c5rwr" Jan 23 11:07:44 crc kubenswrapper[4957]: I0123 11:07:44.220945 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sl2hr" event={"ID":"e3c5b3fa-4657-45db-ad53-8a48be03edfa","Type":"ContainerStarted","Data":"fc5b80261c29a2a3af250d7fbd7d7ab876b6bc60238f199636c41101fd390fc9"} Jan 23 11:07:44 crc kubenswrapper[4957]: I0123 11:07:44.259953 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-c5rwr" Jan 23 11:07:45 crc kubenswrapper[4957]: I0123 11:07:45.227986 4957 generic.go:334] "Generic (PLEG): container finished" podID="e3c5b3fa-4657-45db-ad53-8a48be03edfa" containerID="fc5b80261c29a2a3af250d7fbd7d7ab876b6bc60238f199636c41101fd390fc9" exitCode=0 Jan 23 11:07:45 crc kubenswrapper[4957]: I0123 11:07:45.228078 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-sl2hr" event={"ID":"e3c5b3fa-4657-45db-ad53-8a48be03edfa","Type":"ContainerDied","Data":"fc5b80261c29a2a3af250d7fbd7d7ab876b6bc60238f199636c41101fd390fc9"} Jan 23 11:07:45 crc kubenswrapper[4957]: I0123 11:07:45.523513 4957 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-c5rwr"] Jan 23 11:07:45 crc kubenswrapper[4957]: I0123 11:07:45.716871 4957 patch_prober.go:28] interesting pod/machine-config-daemon-w2xjv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 23 11:07:45 crc kubenswrapper[4957]: I0123 11:07:45.717177 4957 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w2xjv" podUID="224e3211-1f68-4673-8975-7e71b1e513d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 23 11:07:46 crc kubenswrapper[4957]: I0123 11:07:46.237898 4957 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-c5rwr" podUID="9f845eb7-4020-418c-976f-710e1d8e7764" containerName="registry-server" containerID="cri-o://546f939f3aea5dc9b2f1e4ee98d71b7cc1ba3ced8bee2924fdd429c98af624c2" gracePeriod=2 Jan 23 11:07:48 crc kubenswrapper[4957]: I0123 11:07:48.256242 4957 generic.go:334] "Generic (PLEG): container finished" podID="9f845eb7-4020-418c-976f-710e1d8e7764" containerID="546f939f3aea5dc9b2f1e4ee98d71b7cc1ba3ced8bee2924fdd429c98af624c2" exitCode=0 Jan 23 11:07:48 crc kubenswrapper[4957]: I0123 11:07:48.256329 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c5rwr" event={"ID":"9f845eb7-4020-418c-976f-710e1d8e7764","Type":"ContainerDied","Data":"546f939f3aea5dc9b2f1e4ee98d71b7cc1ba3ced8bee2924fdd429c98af624c2"} Jan 23 11:07:48 crc kubenswrapper[4957]: I0123 11:07:48.554273 4957 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-c5rwr" Jan 23 11:07:48 crc kubenswrapper[4957]: I0123 11:07:48.703532 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f845eb7-4020-418c-976f-710e1d8e7764-utilities\") pod \"9f845eb7-4020-418c-976f-710e1d8e7764\" (UID: \"9f845eb7-4020-418c-976f-710e1d8e7764\") " Jan 23 11:07:48 crc kubenswrapper[4957]: I0123 11:07:48.703628 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f845eb7-4020-418c-976f-710e1d8e7764-catalog-content\") pod \"9f845eb7-4020-418c-976f-710e1d8e7764\" (UID: \"9f845eb7-4020-418c-976f-710e1d8e7764\") " Jan 23 11:07:48 crc kubenswrapper[4957]: I0123 11:07:48.703708 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qww6r\" (UniqueName: \"kubernetes.io/projected/9f845eb7-4020-418c-976f-710e1d8e7764-kube-api-access-qww6r\") pod \"9f845eb7-4020-418c-976f-710e1d8e7764\" (UID: \"9f845eb7-4020-418c-976f-710e1d8e7764\") " Jan 23 11:07:48 crc kubenswrapper[4957]: I0123 11:07:48.704890 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f845eb7-4020-418c-976f-710e1d8e7764-utilities" (OuterVolumeSpecName: "utilities") pod "9f845eb7-4020-418c-976f-710e1d8e7764" (UID: "9f845eb7-4020-418c-976f-710e1d8e7764"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 23 11:07:48 crc kubenswrapper[4957]: I0123 11:07:48.716578 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f845eb7-4020-418c-976f-710e1d8e7764-kube-api-access-qww6r" (OuterVolumeSpecName: "kube-api-access-qww6r") pod "9f845eb7-4020-418c-976f-710e1d8e7764" (UID: "9f845eb7-4020-418c-976f-710e1d8e7764"). InnerVolumeSpecName "kube-api-access-qww6r". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 23 11:07:48 crc kubenswrapper[4957]: I0123 11:07:48.750947 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f845eb7-4020-418c-976f-710e1d8e7764-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9f845eb7-4020-418c-976f-710e1d8e7764" (UID: "9f845eb7-4020-418c-976f-710e1d8e7764"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 23 11:07:48 crc kubenswrapper[4957]: I0123 11:07:48.805503 4957 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f845eb7-4020-418c-976f-710e1d8e7764-utilities\") on node \"crc\" DevicePath \"\"" Jan 23 11:07:48 crc kubenswrapper[4957]: I0123 11:07:48.805719 4957 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f845eb7-4020-418c-976f-710e1d8e7764-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 23 11:07:48 crc kubenswrapper[4957]: I0123 11:07:48.805825 4957 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qww6r\" (UniqueName: \"kubernetes.io/projected/9f845eb7-4020-418c-976f-710e1d8e7764-kube-api-access-qww6r\") on node \"crc\" DevicePath \"\"" Jan 23 11:07:49 crc kubenswrapper[4957]: I0123 11:07:49.265087 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c5rwr" event={"ID":"9f845eb7-4020-418c-976f-710e1d8e7764","Type":"ContainerDied","Data":"d7d9ed8d7a9cb1b681ac4a564eb1cece1873d704effa1b4e5076aa961a3b670f"} Jan 23 11:07:49 crc kubenswrapper[4957]: I0123 11:07:49.265160 4957 scope.go:117] "RemoveContainer" containerID="546f939f3aea5dc9b2f1e4ee98d71b7cc1ba3ced8bee2924fdd429c98af624c2" Jan 23 11:07:49 crc kubenswrapper[4957]: I0123 11:07:49.265377 4957 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-c5rwr" Jan 23 11:07:49 crc kubenswrapper[4957]: I0123 11:07:49.270799 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sl2hr" event={"ID":"e3c5b3fa-4657-45db-ad53-8a48be03edfa","Type":"ContainerStarted","Data":"6b12105869273aa8f36b7f57c0d8851c3a1b9fda8a5e99baaac477d286039eac"} Jan 23 11:07:49 crc kubenswrapper[4957]: I0123 11:07:49.285766 4957 scope.go:117] "RemoveContainer" containerID="28aad0918ad476730cf105f51a7bf2440368694a0617b701759c8400520a8bac" Jan 23 11:07:49 crc kubenswrapper[4957]: I0123 11:07:49.294909 4957 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-sl2hr" podStartSLOduration=4.190252755 podStartE2EDuration="10.29489508s" podCreationTimestamp="2026-01-23 11:07:39 +0000 UTC" firstStartedPulling="2026-01-23 11:07:42.207673781 +0000 UTC m=+971.744926468" lastFinishedPulling="2026-01-23 11:07:48.312316086 +0000 UTC m=+977.849568793" observedRunningTime="2026-01-23 11:07:49.29237205 +0000 UTC m=+978.829624777" watchObservedRunningTime="2026-01-23 11:07:49.29489508 +0000 UTC m=+978.832147767" Jan 23 11:07:49 crc kubenswrapper[4957]: I0123 11:07:49.307881 4957 scope.go:117] "RemoveContainer" containerID="121d9b095d9b80d0f6f2626b6140381a20fe1167ae326a59b11eeb99bc3de0d6" Jan 23 11:07:49 crc kubenswrapper[4957]: I0123 11:07:49.326509 4957 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-c5rwr"] Jan 23 11:07:49 crc kubenswrapper[4957]: I0123 11:07:49.333649 4957 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-c5rwr"] Jan 23 11:07:49 crc kubenswrapper[4957]: I0123 11:07:49.436259 4957 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-sl2hr" Jan 23 11:07:49 crc kubenswrapper[4957]: I0123 11:07:49.439444 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-operators-sl2hr" Jan 23 11:07:50 crc kubenswrapper[4957]: I0123 11:07:50.480961 4957 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-sl2hr" podUID="e3c5b3fa-4657-45db-ad53-8a48be03edfa" containerName="registry-server" probeResult="failure" output=< Jan 23 11:07:50 crc kubenswrapper[4957]: timeout: failed to connect service ":50051" within 1s Jan 23 11:07:50 crc kubenswrapper[4957]: > Jan 23 11:07:50 crc kubenswrapper[4957]: I0123 11:07:50.776345 4957 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f845eb7-4020-418c-976f-710e1d8e7764" path="/var/lib/kubelet/pods/9f845eb7-4020-418c-976f-710e1d8e7764/volumes" Jan 23 11:07:53 crc kubenswrapper[4957]: I0123 11:07:53.333900 4957 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qw9nw"] Jan 23 11:07:53 crc kubenswrapper[4957]: E0123 11:07:53.334516 4957 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f845eb7-4020-418c-976f-710e1d8e7764" containerName="extract-content" Jan 23 11:07:53 crc kubenswrapper[4957]: I0123 11:07:53.334533 4957 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f845eb7-4020-418c-976f-710e1d8e7764" containerName="extract-content" Jan 23 11:07:53 crc kubenswrapper[4957]: E0123 11:07:53.334568 4957 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f845eb7-4020-418c-976f-710e1d8e7764" containerName="registry-server" Jan 23 11:07:53 crc kubenswrapper[4957]: I0123 11:07:53.334576 4957 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f845eb7-4020-418c-976f-710e1d8e7764" containerName="registry-server" Jan 23 11:07:53 crc kubenswrapper[4957]: E0123 11:07:53.334592 4957 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f845eb7-4020-418c-976f-710e1d8e7764" containerName="extract-utilities" Jan 23 11:07:53 crc kubenswrapper[4957]: I0123 11:07:53.334599 4957 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f845eb7-4020-418c-976f-710e1d8e7764" containerName="extract-utilities" Jan 23 11:07:53 crc kubenswrapper[4957]: I0123 11:07:53.334758 4957 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f845eb7-4020-418c-976f-710e1d8e7764" containerName="registry-server" Jan 23 11:07:53 crc kubenswrapper[4957]: I0123 11:07:53.335882 4957 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qw9nw" Jan 23 11:07:53 crc kubenswrapper[4957]: I0123 11:07:53.352238 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qw9nw"] Jan 23 11:07:53 crc kubenswrapper[4957]: I0123 11:07:53.465660 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ldf7\" (UniqueName: \"kubernetes.io/projected/9c8ab07f-a307-47b1-a429-2d3d3a4889f2-kube-api-access-4ldf7\") pod \"certified-operators-qw9nw\" (UID: \"9c8ab07f-a307-47b1-a429-2d3d3a4889f2\") " pod="openshift-marketplace/certified-operators-qw9nw" Jan 23 11:07:53 crc kubenswrapper[4957]: I0123 11:07:53.465946 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c8ab07f-a307-47b1-a429-2d3d3a4889f2-catalog-content\") pod \"certified-operators-qw9nw\" (UID: \"9c8ab07f-a307-47b1-a429-2d3d3a4889f2\") " pod="openshift-marketplace/certified-operators-qw9nw" Jan 23 11:07:53 crc kubenswrapper[4957]: I0123 11:07:53.466109 4957 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c8ab07f-a307-47b1-a429-2d3d3a4889f2-utilities\") pod \"certified-operators-qw9nw\" (UID: \"9c8ab07f-a307-47b1-a429-2d3d3a4889f2\") " pod="openshift-marketplace/certified-operators-qw9nw" Jan 23 11:07:53 crc kubenswrapper[4957]: I0123 11:07:53.567315 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c8ab07f-a307-47b1-a429-2d3d3a4889f2-utilities\") pod \"certified-operators-qw9nw\" (UID: \"9c8ab07f-a307-47b1-a429-2d3d3a4889f2\") " pod="openshift-marketplace/certified-operators-qw9nw" Jan 23 11:07:53 crc kubenswrapper[4957]: I0123 11:07:53.567389 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ldf7\" (UniqueName: \"kubernetes.io/projected/9c8ab07f-a307-47b1-a429-2d3d3a4889f2-kube-api-access-4ldf7\") pod \"certified-operators-qw9nw\" (UID: \"9c8ab07f-a307-47b1-a429-2d3d3a4889f2\") " pod="openshift-marketplace/certified-operators-qw9nw" Jan 23 11:07:53 crc kubenswrapper[4957]: I0123 11:07:53.567429 4957 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c8ab07f-a307-47b1-a429-2d3d3a4889f2-catalog-content\") pod \"certified-operators-qw9nw\" (UID: \"9c8ab07f-a307-47b1-a429-2d3d3a4889f2\") " pod="openshift-marketplace/certified-operators-qw9nw" Jan 23 11:07:53 crc kubenswrapper[4957]: I0123 11:07:53.567991 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c8ab07f-a307-47b1-a429-2d3d3a4889f2-catalog-content\") pod \"certified-operators-qw9nw\" (UID: \"9c8ab07f-a307-47b1-a429-2d3d3a4889f2\") " pod="openshift-marketplace/certified-operators-qw9nw" Jan 23 11:07:53 crc kubenswrapper[4957]: I0123 11:07:53.568042 4957 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c8ab07f-a307-47b1-a429-2d3d3a4889f2-utilities\") pod \"certified-operators-qw9nw\" (UID: \"9c8ab07f-a307-47b1-a429-2d3d3a4889f2\") " pod="openshift-marketplace/certified-operators-qw9nw" Jan 23 11:07:53 crc kubenswrapper[4957]: I0123 11:07:53.589212 4957 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-4ldf7\" (UniqueName: \"kubernetes.io/projected/9c8ab07f-a307-47b1-a429-2d3d3a4889f2-kube-api-access-4ldf7\") pod \"certified-operators-qw9nw\" (UID: \"9c8ab07f-a307-47b1-a429-2d3d3a4889f2\") " pod="openshift-marketplace/certified-operators-qw9nw" Jan 23 11:07:53 crc kubenswrapper[4957]: I0123 11:07:53.655013 4957 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qw9nw" Jan 23 11:07:54 crc kubenswrapper[4957]: I0123 11:07:54.004990 4957 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qw9nw"] Jan 23 11:07:54 crc kubenswrapper[4957]: I0123 11:07:54.298716 4957 generic.go:334] "Generic (PLEG): container finished" podID="9c8ab07f-a307-47b1-a429-2d3d3a4889f2" containerID="0bfc8fd37a0574141fe4c2ca2b8182094b9e83361615c7a88ab2b986c9563139" exitCode=0 Jan 23 11:07:54 crc kubenswrapper[4957]: I0123 11:07:54.298806 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qw9nw" event={"ID":"9c8ab07f-a307-47b1-a429-2d3d3a4889f2","Type":"ContainerDied","Data":"0bfc8fd37a0574141fe4c2ca2b8182094b9e83361615c7a88ab2b986c9563139"} Jan 23 11:07:54 crc kubenswrapper[4957]: I0123 11:07:54.299044 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qw9nw" event={"ID":"9c8ab07f-a307-47b1-a429-2d3d3a4889f2","Type":"ContainerStarted","Data":"b62d7a4300403a524a5d3ce4d1946c879f81953383be151a0c7d227c4fa1bdd7"} Jan 23 11:07:58 crc kubenswrapper[4957]: I0123 11:07:58.325058 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qw9nw" event={"ID":"9c8ab07f-a307-47b1-a429-2d3d3a4889f2","Type":"ContainerStarted","Data":"7af6c276693f9eab0e61c25590182bc454d108a1e3a07d1c2e897f0fe7c23a2f"} Jan 23 11:07:59 crc kubenswrapper[4957]: I0123 11:07:59.334938 4957 generic.go:334] "Generic (PLEG): container finished" podID="9c8ab07f-a307-47b1-a429-2d3d3a4889f2" containerID="7af6c276693f9eab0e61c25590182bc454d108a1e3a07d1c2e897f0fe7c23a2f" exitCode=0 Jan 23 11:07:59 crc kubenswrapper[4957]: I0123 11:07:59.334976 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qw9nw" event={"ID":"9c8ab07f-a307-47b1-a429-2d3d3a4889f2","Type":"ContainerDied","Data":"7af6c276693f9eab0e61c25590182bc454d108a1e3a07d1c2e897f0fe7c23a2f"} Jan 23 11:07:59 crc kubenswrapper[4957]: I0123 11:07:59.496366 4957 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-sl2hr" Jan 23 11:07:59 crc kubenswrapper[4957]: I0123 11:07:59.533824 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-sl2hr" Jan 23 11:08:00 crc kubenswrapper[4957]: I0123 11:08:00.581684 4957 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-sl2hr"] Jan 23 11:08:01 crc kubenswrapper[4957]: I0123 11:08:01.351862 4957 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-sl2hr" podUID="e3c5b3fa-4657-45db-ad53-8a48be03edfa" containerName="registry-server" containerID="cri-o://6b12105869273aa8f36b7f57c0d8851c3a1b9fda8a5e99baaac477d286039eac" gracePeriod=2 Jan 23 11:08:02 crc kubenswrapper[4957]: I0123 11:08:02.361829 4957 generic.go:334] "Generic (PLEG): container finished" podID="e3c5b3fa-4657-45db-ad53-8a48be03edfa" 
containerID="6b12105869273aa8f36b7f57c0d8851c3a1b9fda8a5e99baaac477d286039eac" exitCode=0 Jan 23 11:08:02 crc kubenswrapper[4957]: I0123 11:08:02.362052 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sl2hr" event={"ID":"e3c5b3fa-4657-45db-ad53-8a48be03edfa","Type":"ContainerDied","Data":"6b12105869273aa8f36b7f57c0d8851c3a1b9fda8a5e99baaac477d286039eac"} Jan 23 11:08:02 crc kubenswrapper[4957]: I0123 11:08:02.364713 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qw9nw" event={"ID":"9c8ab07f-a307-47b1-a429-2d3d3a4889f2","Type":"ContainerStarted","Data":"8f97b3720c2e4e6124debaa852a8169a19ae7e6978e3e03c7c6cf3166d1d8698"} Jan 23 11:08:02 crc kubenswrapper[4957]: I0123 11:08:02.384705 4957 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-qw9nw" podStartSLOduration=1.841665115 podStartE2EDuration="9.384685948s" podCreationTimestamp="2026-01-23 11:07:53 +0000 UTC" firstStartedPulling="2026-01-23 11:07:54.300600931 +0000 UTC m=+983.837853628" lastFinishedPulling="2026-01-23 11:08:01.843621774 +0000 UTC m=+991.380874461" observedRunningTime="2026-01-23 11:08:02.380211833 +0000 UTC m=+991.917464530" watchObservedRunningTime="2026-01-23 11:08:02.384685948 +0000 UTC m=+991.921938635" Jan 23 11:08:02 crc kubenswrapper[4957]: I0123 11:08:02.436767 4957 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sl2hr" Jan 23 11:08:02 crc kubenswrapper[4957]: I0123 11:08:02.506577 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3c5b3fa-4657-45db-ad53-8a48be03edfa-catalog-content\") pod \"e3c5b3fa-4657-45db-ad53-8a48be03edfa\" (UID: \"e3c5b3fa-4657-45db-ad53-8a48be03edfa\") " Jan 23 11:08:02 crc kubenswrapper[4957]: I0123 11:08:02.506651 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3c5b3fa-4657-45db-ad53-8a48be03edfa-utilities\") pod \"e3c5b3fa-4657-45db-ad53-8a48be03edfa\" (UID: \"e3c5b3fa-4657-45db-ad53-8a48be03edfa\") " Jan 23 11:08:02 crc kubenswrapper[4957]: I0123 11:08:02.506702 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hcnz2\" (UniqueName: \"kubernetes.io/projected/e3c5b3fa-4657-45db-ad53-8a48be03edfa-kube-api-access-hcnz2\") pod \"e3c5b3fa-4657-45db-ad53-8a48be03edfa\" (UID: \"e3c5b3fa-4657-45db-ad53-8a48be03edfa\") " Jan 23 11:08:02 crc kubenswrapper[4957]: I0123 11:08:02.507539 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3c5b3fa-4657-45db-ad53-8a48be03edfa-utilities" (OuterVolumeSpecName: "utilities") pod "e3c5b3fa-4657-45db-ad53-8a48be03edfa" (UID: "e3c5b3fa-4657-45db-ad53-8a48be03edfa"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 23 11:08:02 crc kubenswrapper[4957]: I0123 11:08:02.513001 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3c5b3fa-4657-45db-ad53-8a48be03edfa-kube-api-access-hcnz2" (OuterVolumeSpecName: "kube-api-access-hcnz2") pod "e3c5b3fa-4657-45db-ad53-8a48be03edfa" (UID: "e3c5b3fa-4657-45db-ad53-8a48be03edfa"). InnerVolumeSpecName "kube-api-access-hcnz2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 23 11:08:02 crc kubenswrapper[4957]: I0123 11:08:02.608370 4957 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3c5b3fa-4657-45db-ad53-8a48be03edfa-utilities\") on node \"crc\" DevicePath \"\"" Jan 23 11:08:02 crc kubenswrapper[4957]: I0123 11:08:02.608400 4957 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hcnz2\" (UniqueName: \"kubernetes.io/projected/e3c5b3fa-4657-45db-ad53-8a48be03edfa-kube-api-access-hcnz2\") on node \"crc\" DevicePath \"\"" Jan 23 11:08:03 crc kubenswrapper[4957]: I0123 11:08:03.297015 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3c5b3fa-4657-45db-ad53-8a48be03edfa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e3c5b3fa-4657-45db-ad53-8a48be03edfa" (UID: "e3c5b3fa-4657-45db-ad53-8a48be03edfa"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 23 11:08:03 crc kubenswrapper[4957]: I0123 11:08:03.316163 4957 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3c5b3fa-4657-45db-ad53-8a48be03edfa-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 23 11:08:03 crc kubenswrapper[4957]: I0123 11:08:03.376715 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sl2hr" event={"ID":"e3c5b3fa-4657-45db-ad53-8a48be03edfa","Type":"ContainerDied","Data":"0e5f3290f7f0034dd066087fb778531e597a019f0221fa4a915d741cc3fe466f"} Jan 23 11:08:03 crc kubenswrapper[4957]: I0123 11:08:03.376783 4957 scope.go:117] "RemoveContainer" containerID="6b12105869273aa8f36b7f57c0d8851c3a1b9fda8a5e99baaac477d286039eac" Jan 23 11:08:03 crc kubenswrapper[4957]: I0123 11:08:03.377361 4957 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-sl2hr" Jan 23 11:08:03 crc kubenswrapper[4957]: I0123 11:08:03.395578 4957 scope.go:117] "RemoveContainer" containerID="fc5b80261c29a2a3af250d7fbd7d7ab876b6bc60238f199636c41101fd390fc9" Jan 23 11:08:03 crc kubenswrapper[4957]: I0123 11:08:03.416402 4957 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-sl2hr"] Jan 23 11:08:03 crc kubenswrapper[4957]: I0123 11:08:03.419636 4957 scope.go:117] "RemoveContainer" containerID="405a0e7ea4810fb11fbffef72d629a606ea2f1eefb24c38bfa732f7aedb06b8f" Jan 23 11:08:03 crc kubenswrapper[4957]: I0123 11:08:03.423981 4957 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-sl2hr"] Jan 23 11:08:03 crc kubenswrapper[4957]: I0123 11:08:03.656506 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-qw9nw" Jan 23 11:08:03 crc kubenswrapper[4957]: I0123 11:08:03.656564 4957 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-qw9nw" Jan 23 11:08:04 crc kubenswrapper[4957]: I0123 11:08:04.700432 4957 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-qw9nw" podUID="9c8ab07f-a307-47b1-a429-2d3d3a4889f2" containerName="registry-server" probeResult="failure" output=< Jan 23 11:08:04 crc kubenswrapper[4957]: timeout: failed to connect service ":50051" within 1s Jan 23 11:08:04 crc kubenswrapper[4957]: > Jan 23 11:08:04 crc kubenswrapper[4957]: I0123 11:08:04.784947 4957 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3c5b3fa-4657-45db-ad53-8a48be03edfa" path="/var/lib/kubelet/pods/e3c5b3fa-4657-45db-ad53-8a48be03edfa/volumes" Jan 23 11:08:13 crc kubenswrapper[4957]: I0123 11:08:13.705241 4957 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qw9nw" Jan 23 11:08:13 crc kubenswrapper[4957]: I0123 11:08:13.759745 4957 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-qw9nw" Jan 23 11:08:13 crc kubenswrapper[4957]: I0123 11:08:13.937440 4957 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qw9nw"] Jan 23 11:08:15 crc kubenswrapper[4957]: I0123 11:08:15.463856 4957 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-qw9nw" podUID="9c8ab07f-a307-47b1-a429-2d3d3a4889f2" containerName="registry-server" containerID="cri-o://8f97b3720c2e4e6124debaa852a8169a19ae7e6978e3e03c7c6cf3166d1d8698" gracePeriod=2 Jan 23 11:08:15 crc kubenswrapper[4957]: I0123 11:08:15.717081 4957 patch_prober.go:28] interesting pod/machine-config-daemon-w2xjv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 23 11:08:15 crc kubenswrapper[4957]: I0123 11:08:15.717151 4957 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w2xjv" podUID="224e3211-1f68-4673-8975-7e71b1e513d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 23 11:08:16 crc kubenswrapper[4957]: I0123 
11:08:16.342991 4957 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qw9nw" Jan 23 11:08:16 crc kubenswrapper[4957]: I0123 11:08:16.414211 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c8ab07f-a307-47b1-a429-2d3d3a4889f2-catalog-content\") pod \"9c8ab07f-a307-47b1-a429-2d3d3a4889f2\" (UID: \"9c8ab07f-a307-47b1-a429-2d3d3a4889f2\") " Jan 23 11:08:16 crc kubenswrapper[4957]: I0123 11:08:16.414312 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c8ab07f-a307-47b1-a429-2d3d3a4889f2-utilities\") pod \"9c8ab07f-a307-47b1-a429-2d3d3a4889f2\" (UID: \"9c8ab07f-a307-47b1-a429-2d3d3a4889f2\") " Jan 23 11:08:16 crc kubenswrapper[4957]: I0123 11:08:16.414345 4957 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4ldf7\" (UniqueName: \"kubernetes.io/projected/9c8ab07f-a307-47b1-a429-2d3d3a4889f2-kube-api-access-4ldf7\") pod \"9c8ab07f-a307-47b1-a429-2d3d3a4889f2\" (UID: \"9c8ab07f-a307-47b1-a429-2d3d3a4889f2\") " Jan 23 11:08:16 crc kubenswrapper[4957]: I0123 11:08:16.426995 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c8ab07f-a307-47b1-a429-2d3d3a4889f2-utilities" (OuterVolumeSpecName: "utilities") pod "9c8ab07f-a307-47b1-a429-2d3d3a4889f2" (UID: "9c8ab07f-a307-47b1-a429-2d3d3a4889f2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 23 11:08:16 crc kubenswrapper[4957]: I0123 11:08:16.438774 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c8ab07f-a307-47b1-a429-2d3d3a4889f2-kube-api-access-4ldf7" (OuterVolumeSpecName: "kube-api-access-4ldf7") pod "9c8ab07f-a307-47b1-a429-2d3d3a4889f2" (UID: "9c8ab07f-a307-47b1-a429-2d3d3a4889f2"). InnerVolumeSpecName "kube-api-access-4ldf7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 23 11:08:16 crc kubenswrapper[4957]: I0123 11:08:16.459104 4957 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c8ab07f-a307-47b1-a429-2d3d3a4889f2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9c8ab07f-a307-47b1-a429-2d3d3a4889f2" (UID: "9c8ab07f-a307-47b1-a429-2d3d3a4889f2"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 23 11:08:16 crc kubenswrapper[4957]: I0123 11:08:16.470535 4957 generic.go:334] "Generic (PLEG): container finished" podID="9c8ab07f-a307-47b1-a429-2d3d3a4889f2" containerID="8f97b3720c2e4e6124debaa852a8169a19ae7e6978e3e03c7c6cf3166d1d8698" exitCode=0 Jan 23 11:08:16 crc kubenswrapper[4957]: I0123 11:08:16.470572 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qw9nw" event={"ID":"9c8ab07f-a307-47b1-a429-2d3d3a4889f2","Type":"ContainerDied","Data":"8f97b3720c2e4e6124debaa852a8169a19ae7e6978e3e03c7c6cf3166d1d8698"} Jan 23 11:08:16 crc kubenswrapper[4957]: I0123 11:08:16.470596 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qw9nw" event={"ID":"9c8ab07f-a307-47b1-a429-2d3d3a4889f2","Type":"ContainerDied","Data":"b62d7a4300403a524a5d3ce4d1946c879f81953383be151a0c7d227c4fa1bdd7"} Jan 23 11:08:16 crc kubenswrapper[4957]: I0123 11:08:16.470612 4957 scope.go:117] "RemoveContainer" containerID="8f97b3720c2e4e6124debaa852a8169a19ae7e6978e3e03c7c6cf3166d1d8698" Jan 23 11:08:16 crc kubenswrapper[4957]: I0123 11:08:16.470611 4957 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qw9nw" Jan 23 11:08:16 crc kubenswrapper[4957]: I0123 11:08:16.488484 4957 scope.go:117] "RemoveContainer" containerID="7af6c276693f9eab0e61c25590182bc454d108a1e3a07d1c2e897f0fe7c23a2f" Jan 23 11:08:16 crc kubenswrapper[4957]: I0123 11:08:16.502419 4957 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qw9nw"] Jan 23 11:08:16 crc kubenswrapper[4957]: I0123 11:08:16.506472 4957 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-qw9nw"] Jan 23 11:08:16 crc kubenswrapper[4957]: I0123 11:08:16.515642 4957 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c8ab07f-a307-47b1-a429-2d3d3a4889f2-utilities\") on node \"crc\" DevicePath \"\"" Jan 23 11:08:16 crc kubenswrapper[4957]: I0123 11:08:16.515676 4957 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4ldf7\" (UniqueName: \"kubernetes.io/projected/9c8ab07f-a307-47b1-a429-2d3d3a4889f2-kube-api-access-4ldf7\") on node \"crc\" DevicePath \"\"" Jan 23 11:08:16 crc kubenswrapper[4957]: I0123 11:08:16.515685 4957 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c8ab07f-a307-47b1-a429-2d3d3a4889f2-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 23 11:08:16 crc kubenswrapper[4957]: I0123 11:08:16.527384 4957 scope.go:117] "RemoveContainer" containerID="0bfc8fd37a0574141fe4c2ca2b8182094b9e83361615c7a88ab2b986c9563139" Jan 23 11:08:16 crc kubenswrapper[4957]: I0123 11:08:16.543494 4957 scope.go:117] "RemoveContainer" containerID="8f97b3720c2e4e6124debaa852a8169a19ae7e6978e3e03c7c6cf3166d1d8698" Jan 23 11:08:16 crc kubenswrapper[4957]: E0123 11:08:16.543855 4957 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f97b3720c2e4e6124debaa852a8169a19ae7e6978e3e03c7c6cf3166d1d8698\": container with ID starting with 8f97b3720c2e4e6124debaa852a8169a19ae7e6978e3e03c7c6cf3166d1d8698 not found: ID does not exist" containerID="8f97b3720c2e4e6124debaa852a8169a19ae7e6978e3e03c7c6cf3166d1d8698" Jan 23 11:08:16 crc kubenswrapper[4957]: I0123 11:08:16.543914 
4957 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f97b3720c2e4e6124debaa852a8169a19ae7e6978e3e03c7c6cf3166d1d8698"} err="failed to get container status \"8f97b3720c2e4e6124debaa852a8169a19ae7e6978e3e03c7c6cf3166d1d8698\": rpc error: code = NotFound desc = could not find container \"8f97b3720c2e4e6124debaa852a8169a19ae7e6978e3e03c7c6cf3166d1d8698\": container with ID starting with 8f97b3720c2e4e6124debaa852a8169a19ae7e6978e3e03c7c6cf3166d1d8698 not found: ID does not exist" Jan 23 11:08:16 crc kubenswrapper[4957]: I0123 11:08:16.543938 4957 scope.go:117] "RemoveContainer" containerID="7af6c276693f9eab0e61c25590182bc454d108a1e3a07d1c2e897f0fe7c23a2f" Jan 23 11:08:16 crc kubenswrapper[4957]: E0123 11:08:16.544231 4957 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7af6c276693f9eab0e61c25590182bc454d108a1e3a07d1c2e897f0fe7c23a2f\": container with ID starting with 7af6c276693f9eab0e61c25590182bc454d108a1e3a07d1c2e897f0fe7c23a2f not found: ID does not exist" containerID="7af6c276693f9eab0e61c25590182bc454d108a1e3a07d1c2e897f0fe7c23a2f" Jan 23 11:08:16 crc kubenswrapper[4957]: I0123 11:08:16.544260 4957 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7af6c276693f9eab0e61c25590182bc454d108a1e3a07d1c2e897f0fe7c23a2f"} err="failed to get container status \"7af6c276693f9eab0e61c25590182bc454d108a1e3a07d1c2e897f0fe7c23a2f\": rpc error: code = NotFound desc = could not find container \"7af6c276693f9eab0e61c25590182bc454d108a1e3a07d1c2e897f0fe7c23a2f\": container with ID starting with 7af6c276693f9eab0e61c25590182bc454d108a1e3a07d1c2e897f0fe7c23a2f not found: ID does not exist" Jan 23 11:08:16 crc kubenswrapper[4957]: I0123 11:08:16.544308 4957 scope.go:117] "RemoveContainer" containerID="0bfc8fd37a0574141fe4c2ca2b8182094b9e83361615c7a88ab2b986c9563139" Jan 23 11:08:16 crc kubenswrapper[4957]: E0123 11:08:16.544605 4957 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0bfc8fd37a0574141fe4c2ca2b8182094b9e83361615c7a88ab2b986c9563139\": container with ID starting with 0bfc8fd37a0574141fe4c2ca2b8182094b9e83361615c7a88ab2b986c9563139 not found: ID does not exist" containerID="0bfc8fd37a0574141fe4c2ca2b8182094b9e83361615c7a88ab2b986c9563139" Jan 23 11:08:16 crc kubenswrapper[4957]: I0123 11:08:16.544650 4957 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0bfc8fd37a0574141fe4c2ca2b8182094b9e83361615c7a88ab2b986c9563139"} err="failed to get container status \"0bfc8fd37a0574141fe4c2ca2b8182094b9e83361615c7a88ab2b986c9563139\": rpc error: code = NotFound desc = could not find container \"0bfc8fd37a0574141fe4c2ca2b8182094b9e83361615c7a88ab2b986c9563139\": container with ID starting with 0bfc8fd37a0574141fe4c2ca2b8182094b9e83361615c7a88ab2b986c9563139 not found: ID does not exist" Jan 23 11:08:16 crc kubenswrapper[4957]: I0123 11:08:16.777570 4957 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c8ab07f-a307-47b1-a429-2d3d3a4889f2" path="/var/lib/kubelet/pods/9c8ab07f-a307-47b1-a429-2d3d3a4889f2/volumes" Jan 23 11:08:45 crc kubenswrapper[4957]: I0123 11:08:45.716999 4957 patch_prober.go:28] interesting pod/machine-config-daemon-w2xjv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
Jan 23 11:08:45 crc kubenswrapper[4957]: I0123 11:08:45.716999 4957 patch_prober.go:28] interesting pod/machine-config-daemon-w2xjv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 23 11:08:45 crc kubenswrapper[4957]: I0123 11:08:45.717590 4957 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w2xjv" podUID="224e3211-1f68-4673-8975-7e71b1e513d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 23 11:08:45 crc kubenswrapper[4957]: I0123 11:08:45.717650 4957 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w2xjv"
Jan 23 11:08:45 crc kubenswrapper[4957]: I0123 11:08:45.718285 4957 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fbb8a9b1301ce5b3887b4c976fd329b3eaa92abd5c98e3db947c53dedb1ed1cd"} pod="openshift-machine-config-operator/machine-config-daemon-w2xjv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Jan 23 11:08:45 crc kubenswrapper[4957]: I0123 11:08:45.718354 4957 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w2xjv" podUID="224e3211-1f68-4673-8975-7e71b1e513d0" containerName="machine-config-daemon" containerID="cri-o://fbb8a9b1301ce5b3887b4c976fd329b3eaa92abd5c98e3db947c53dedb1ed1cd" gracePeriod=600
Jan 23 11:08:46 crc kubenswrapper[4957]: I0123 11:08:46.675120 4957 generic.go:334] "Generic (PLEG): container finished" podID="224e3211-1f68-4673-8975-7e71b1e513d0" containerID="fbb8a9b1301ce5b3887b4c976fd329b3eaa92abd5c98e3db947c53dedb1ed1cd" exitCode=0
Jan 23 11:08:46 crc kubenswrapper[4957]: I0123 11:08:46.675363 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2xjv" event={"ID":"224e3211-1f68-4673-8975-7e71b1e513d0","Type":"ContainerDied","Data":"fbb8a9b1301ce5b3887b4c976fd329b3eaa92abd5c98e3db947c53dedb1ed1cd"}
Jan 23 11:08:46 crc kubenswrapper[4957]: I0123 11:08:46.675554 4957 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2xjv" event={"ID":"224e3211-1f68-4673-8975-7e71b1e513d0","Type":"ContainerStarted","Data":"7cd3b576a484922b8f3fa38e12ac9a048913f4d4b79badbf15a4232c117e18a5"}
Jan 23 11:08:46 crc kubenswrapper[4957]: I0123 11:08:46.675589 4957 scope.go:117] "RemoveContainer" containerID="16ac91441901705df207b992dbdbed5b8c06671e49d0ee3176372ea44a7ecdf1"